History of Computing as History of Technology

PART 5

About this part:

Consider, then, the history of computing in
light of current history of technology. Several
lines of inquiry seem particularly promising.
Studies such as those cited above offer a
panoply of models for tracing the patterns of
growth and progress in computing as a
technology. It is worth asking, for example,
whether the computing industry has moved
forward more by big advances of radical
innovation or by small steps of improvement. Has it followed the process described by Nathan Rosenberg, whereby "... technological improvement not only enters the structure of the economy through the main entrance, as when it takes the highly visible form of major patentable technological breakthroughs, but
that it also employs numerous and less visible side and rear entrances where its arrival is unobtrusive, unannounced, unobserved, and uncelebrated" (Rosenberg 1979, p.26)? To determine whether that is the case will require changes in the history of computing as it is currently practiced. It will mean looking beyond "firsts" to the revisions and modifications that made products work and that account for their real impact. Given the corporate, collaborative structure of modern R&D, historians of computing must follow the admonition once made to historians of technology to stop "substituting biography for careful analysis of social processes". Without denigrating the role of heroes and pioneers, we need more knowledge of computing's equivalent of "shop practices, [and of] the activities of lower-level technicians in factories" (Daniels 1970, p.11). The question is how to pursue that inquiry across the variegated range of the emerging industry.



Viewing computing both as a system
in itself and as a component of a variety of
larger systems may provide important insights into the dynamics of its development and may help to distinguish between its internal and its external history. For example, it suggests an approach to the question of the relation between hardware and software, often couched in the antagonistic form of one driving the other, a form which seems to assume that the two are relatively independent of one another. By contrast, linking them in a system emphasizes their mutual dependence.
One expects of a system that the relationship
among its internal components and their
relationships to external components will vary over time and place but that they will do so in a way that maintains a certain equilibrium or homeostasis, even as the system itself evolves. Seen in that light, the relation between hardware and software is a question not so much of driving forces, or of stimulus and response, as of constraints and degrees of freedom. While in principle all computers have the same capacities as universal Turing machines, in practice different architectures are conducive to different forms of computing.
Certain architectures have technical thresholds (e.g. VLSI is a prerequisite to massively parallel computing), others reflect conscious choices among equally feasible alternatives; some have been influenced by the needs and concerns of software production, others by the special purposes of customers. Early on, programming had to conform to the narrow limits of speed and memory set by vacuum-tube circuitry. As largely exogenous factors in the electronics industry made it possible to expand those limits, and at the same time drastically lowered the cost of hardware, programming could take practical advantage of research into programming languages and compilers. Researchers' ideas of multiuser systems, interactive programming, or
virtual memory required advances in hardware at the same time that they drew out the full power of a new generation of machines. Just as new architectures have challenged established forms of programming, so too theoretical advances in computation and artificial intelligence have suggested new ways
of organizing processors (e.g. Backus 1977).



 At present, the evolution of computing
as a system and of its interfaces with other
systems of thought and action has yet to be
traced. Indeed, it is not clear how many
identifiable systems constitute computing
itself, given the diverse contexts in which it has developed. We speak of the computer industry as if it were a monolith rather than a network of interdependent industries with separate interests and concerns. In addition to historically more analytical studies of
individual firms, both large and small, we need analyses of their interaction and
interdependence. The same holds for
government and academia, neither of which
has spoken with one voice on matters of
computing. Of particular interest here may be the system-building role of the computer in forging new links of interdependence among universities, government, and industry after World War II.
 

Arguing in "The Big Questions" that
creators of the machinery underpinning the
American System worked from a knowledge
of the entire sequence of operations in
production,12 Daniels (1970) pointed to Peter
Drucker's suggestion that "the organization of
work be used as a unifying concept in the
history of technology." The recent volume by
Charles Bashe et al. on IBM's Early
Computers illustrates the potential fruitfulness
of that suggestion for the history of
computing. In tracing IBM's adaptation to the
computer, they bring out the corporate
tensions and adjustments introduced into IBM
by the need to keep abreast of fast-breaking
developments in science and technology and in
turn to share its research with others.[13] The
computer reshaped R&D at IBM, defining new
relations between marketing and research,
introducing a new breed of scientific personnel
with new ways of doing things, and creating
new roles, in particular that of the
programmer. Whether the same holds true of,
say, Bell Laboratories or G.E. Research
Laboratories, remains to be studied, as does
the structure of the R&D institutions
established by the many new firms that
constituted the growing computer industry of
the '50s, '60s, and '70s. Tracy Kidder's (1981)
frankly journalistic account of development at
Data General has given us a tantalizing
glimpse of the patterns we may find. Equally
important will be studies of the emergence of
the data-processing shop, whether as an
independent computer service or as a new
element in established institutions.[14] More
than one company found that the computer
reorganized de facto the lines of effective
managerial power.

The computer seems an obvious place
to look for insight into the question of whether
new technologies respond to need or create it.
Clearly, the first computers responded to the
felt need for high-speed, automatic calculation,
and that remained the justification for their
early development during the late '40s.
Indeed, the numerical analysts evidently
considered the computer to be their baby and
resented its adoption by "computerologists" in
the late '50s and early '60s (Wilkinson 1971).
But it seems equally clear that the computer
became the core of an emergent
data-processing industry more by creating
demand than by responding to it. Much as
Henry Ford taught the nation how to use an
automobile, IBM and its competitors taught
the nation's businesses (and its government)
how to use the computer. How much of the
technical development of the computer
originated in the marketing division remains an
untold story central to an understanding of
modern technology.15 Kidder's Soul of a New
Machine again offers a glimpse of what that
story may reveal.
 One major factor in the creation of
demand seems to have been the alliance

between the computer and the nascent field of
operations research/management science. As
the pages of the Harvard Business Review for
1953 show, the computer and operations
research hit the business stage together, each
a new and untried tool of management, both
clothed in the mantle of science. Against the
fanciful backdrop of Croesus' defeat by
camel-riding Persians, an IBM advertisement
proclaimed that "Yesterday ... 'The Fates'
Decided. Today ... Facts Are What Count".
Appealing to fact-based strides in "military
science, pure science, commerce, and
industry", the advertisement pointed beyond
data processing to "'mathematical models' of
specific processes, products, or situations, [by
which] man today can predetermine probable
results, minimize risks and costs." In less vivid
terms, Cyril C. Herrmann of MIT and John F.
Magee of Arthur D. Little introduced readers
of HBR to "'Operations Research' for
Management" (1953), and John Diebold
(1953) proclaimed "Automation - The New
Technology". As Herbert Simon (1960, p.14)
later pointed out, operations research was
both old and new, with roots going back to
Charles Babbage and Frederick W. Taylor. Its
novelty lay precisely in its claim to provide
'mathematical models' of business operations
as a basis for rational decision-making.
Depending for their sensitivity on
computationally intensive algorithms and large
volumes of data, those models required the
power of the computer.
 It seems crucial for the development
of the computer industry that the business
community accepted the joint claims of OR
and the computer long before either could
validate them by, say, cost-benefit analysis.
The decision to adopt the new methods of
"rational decision-making" seems itself to have
been less than fully rational:
 As business managers we are
revolutionizing the procedures
of our factories and offices
with automation, but what
about our decision making? In
other words, isn't there a
danger that our thought
processes will be left in the
horse-and-buggy stage while
our operations are being run
in the age of nucleonics,
electronics, and jet propulsion?
... Are the engineering and
scientific symbols of our age
significant indicators of a need
for change? (Hurni 1955, p.49)
Even at this early stage, the computer had
acquired symbolic force in the business
community and in society at large. We need to
know the sources of that force and how it
worked to weave the computer into the
economic and social fabric.[16]
The government has played a
determining role in at least four areas of
computing: microelectronics; interactive,
real-time systems; artificial intelligence; and
software engineering. None of these stories
has been told by an historian, although each
promises deep insight into the issues raised
above. Modern weapons systems and the
space program placed a premium on
miniaturization of circuits. Given the costs of
research, development, and tooling for
production, it is hard to imagine that the


integrated circuit and the microprocessor
would have emerged --at least as quickly as
they did-- without government support. As
Frank Rose (1984) put it in Into the Heart of
the Mind, "The computerization of society ...
has essentially been a side effect of the
computerization of war" (p. 36). More is
involved than smaller computers. Architecture
and software change in response to speed of
processor and size of memory. As a result, the
rapid pace of miniaturization tended to place
already inadequate methods of software
production under the pressure of rising
expectations. By the early 1970s the
Department of Defense, as the nation's single
largest procurer of software, had declared a
major stake in the development of software
engineering as a body of methods and tools for
reducing the costs and increasing the reliability
of large programs.
 As Howard Rheingold (1985) has
described in Tools for Thought, the
government was quick to seize on the interest
of computer scientists at MIT in developing
the computer as an enhancement and extension
of human intellectual capabilities. In general,
that interest coincided with the needs of
national defense in the form of interactive
computing, visual displays of both text and
graphics, multi-user systems, and
inter-computer networks. The Advanced
Research Projects Agency (later DARPA)
soon became a source of almost unlimited
funding for research in these areas, a source
that bypassed the usual procedures of scientific
funding, in particular peer review. Much of the
early research in artificial intelligence derived
its funding from the same source, and its
development as a field of computer science
surely reflects that independence from the
agenda of the discipline as a whole.
 Although we commonly speak of
hardware and software in tandem, it is worth
noting that in a strict sense the notion of
software is an artifact of computing in the
business and government sectors during the
'50s. Only when the computer left the research
laboratory and the hands of the scientists and
engineers did the writing of programs become
a question of production. It is in that light that
we may most fruitfully view the development
of programming languages, programming
systems, operating systems, database and file
management systems, and communications and
networks, all of them aimed at facilitating the
work of programmers, maintaining managerial
control over them, and assuring the reliability
of their programs. The Babel of programming
languages in the '60s tends to distract attention
from the fact that three of the most commonly
used languages today are also among the
oldest: FORTRAN for scientific computing,
COBOL for data processing, and LISP for
artificial intelligence. ALGOL might have
remained a laboratory language had it and its
offspring not become the vehicles of structured
programming, a movement addressed directly
to the problems of programming as a form of
production.[17]
 Central to the history of software is
the sense of "crisis" that emerged in the late
'60s as one large project after another ran
behind schedule, over budget, and below
specifications. Though pervasive throughout
the industry, it posed enough of a strategic
threat for the NATO Science Committee to
convene an international conference in 1968


to address it. To emphasize the need for a
concerted effort along new lines, the
committee coined the term "software
engineering", reflecting the view that the
problem required the combination of science
and management thought characteristic of
engineering. Efforts to define that combination
and to develop the corresponding methods
constitute much of the history of computing
during the 1970s, at least in the realm of large
systems, and it is the essential background to
the story of Ada in the 1980s. It also reveals
apparently fundamental differences between
the formal, mathematical orientation of
European computer scientists and the
practical, industrial focus of their American
counterparts. Historians of science and
technology have seen those differences in the
past and have sought to explain them. Can
historians of computing use those explanations
and in turn help to articulate them?
 The effort to give meaning to
"software engineering" as a discipline and to
define a place for it in the training of computer
professionals should call the historian's
attention to the constellation of questions
contained under the heading of "discipline
formation and professionalization". In 1950
computing consisted of a handful of specially
designed machines and a handful of specially
trained programmers. By 1955 some 1000
general-purpose computers required the
services of some 10,000 programmers. By
1960, the number of devices had increased
fivefold, the number of programmers sixfold.
And so the growth continued. With it came
associations, societies, journals, magazines,
and claims to professional and academic
standing. The development of these
institutions is an essential part of the social
history of computing as a technological
enterprise. Again, one may ask to what extent that development has followed historical patterns of institutionalization and to what extent it has created its own.
 The question of sources illustrates
particularly well how recent work in the history of technology may provide important guidance to the history of computing, at the same time that the latter adds new perspectives to that work. As noted above, historians of technology have focused new attention on the non-verbal expressions of engineering practice. Of the three main strands of computing, only theoretical computer science is essentially verbal in nature. Its sources come in the form most familiar to historians of science, namely books, articles, and other less formal pieces of writing, which by and large encompass the thinking behind them. We know pretty well how to read them, even for what they do not say explicitly. Similarly, at the level of institutional and social history, we seem to be on familiar ground, suffering largely from an embarrassment of wealth unwinnowed by time.
 But the computers themselves and the programs that were written for them constitute a quite different range of sources and thus pose the challenge of determining how to read them. As artifacts, computers present the problem of all electrical and electronic devices. They are machines without moving parts. Even when they are running, they display no internal action to explain their outward behavior. Yet, Tracy Kidder's (1981) portrait of Tom West sneaking a look at the boards of the new Vax to see how DEC had gone about its work reminds us that the actual machines may hold tales untold by manuals, technical reports, and engineering drawings. Those sources too demand our attention. When imaginatively read, they promise to throw light not only on the designers but also on those for whom they were designing. Through the hardware and its attendant sources one can follow the changing physiognomy of computers as they made their way from the laboratories and large
installations to the office and the home.


Today's prototypical computer iconically links television to typewriter. How that form emerged from a roomful of tubes and switches is a matter of both technical and cultural history.



 Though hard to interpret, the
hardware is at least tangible. Software, by contrast, is elusively intangible. In essence, it is the behavior of the machines when running. It is what converts their architecture to action, and it is constructed with action in mind; the programmer aims to make something happen. What, then, captures software for the historical record? How do we document and preserve an historically significant compiler, operating system, or database? Computer scientists have pointed to the limitations of the static program text as a basis for determining the program's dynamic behavior, and a provocative article (DeMillo et al. 1979) has questioned how much the written record of programming can tell us about the behavior of programmers. Yet, Gerald M. Weinberg (1971, Chapter 1) has given an example of how programs may be read to reveal the machines and people behind them. In a sense, historians of computing encounter from the opposite direction the problem faced by the software industry: what constitutes an adequate and reliable surrogate for an actually running
program? How, in particular, does the historian recapture, or the producer anticipate, the component that is always missing from the static record of software, namely the user for whom it is written and whose behavior is an essential part of it?
 

Placing the history of computing in the context of the history of technology promises a peculiarly recursive benefit. Although computation by machines has a long history, computing in the sense I have been using here did not exist before the late 1940s. There were no computers, no programmers, no computer scientists, no computer managers. Hence those who invented and improved the computer, those who determined how to program it, those who defined its scientific foundations, those who established it as an industry in itself and introduced it into business and industry all
came to computing from some other background. With no inherent precedents for their work, they had to find their own
precedents. Much of the history of computing, certainly for the first generation, but probably also for the second and third, derives from the precedents these people drew from their past experience. In that sense, the history of technology shaped the history of computing, and the history of computing must turn to the history of technology for initial bearings.


 A specific example may help to
illustrate the point. Daniels (1970) stated as one of the really big questions the development of the 'American System' and its culmination in mass production. It is perhaps the central fact of technology in 19th-century America, and every historian of the subject must grapple with it. So too, though Daniels did not make the point, must historians of 20th-century technology. For mass production has become an historical touchstone for modern engineers, in the area of software as well as elsewhere.
 

For instance, in one of the major
invited papers at the NATO Software
Engineering Conference of 1968, M.D.
McIlroy of Bell Telephone Laboratories looked forward to the end of a "preindustrial era" in programming. His metaphors and similes harked back to the machine-tool industry and its methods of production:

 We undoubtedly produce
software by backward
techniques. We undoubtedly
get the short end of the stick in
confrontations with hardware
people because they are the
industrialists and we are the
crofters. Software production
today appears in the scale of
industrialization somewhere
below the more backward
construction industries. I think
its proper place is considerably
higher, and would like to
investigate the prospects for
mass-production techniques in
software. (McIlroy, 1969)
 

What McIlroy had in mind was not replication in large numbers, which is trivial for the computer, but rather programmed modules that might serve as standardized,
interchangeable parts to be drawn from the library shelf and inserted in larger production programs. A quotation from McIlroy's paper served as leitmotiv to the first part of Peter Wegner's series on "Capital Intensive Software Technology" in the July 1984 number of IEEE Software, which was richly illustrated by photographs of capital industry in the 1930s and included insets on the history of technology.[18] By then McIlroy's equivalent to interchangeable parts had become "reusable software", and software engineers had developed more sophisticated tools for producing it. Whether they were (or now are) any closer to the goal is less important to the historian than the continuing strength of the model. It reveals historical self-consciousness.
 

We should appreciate that self-consciousness at the same time that we view it critically, resisting the temptation to accept the comparisons as valid. An activity's
choice of historical models is itself part of the history of the activity. McIlroy was not describing the state or even the direction of software in 1968. Rather, he was proposing an historical precedent on which to base its future development. What is of interest to the historian of computing is why McIlroy chose the model of mass production as that precedent. Precisely what model of mass production did he have in mind, why did he think it appropriate or applicable to software, why did he think his audience would respond well to the proposal, and so on? The history of technology provides a critical context for evaluating the answers, indeed for shaping the questions. For historians, too, the evolving techniques of mass production in the 19th century constitute a model, or prototype, of
technological development. Whether it is one model or a set of closely related models is a matter of current scholarly debate, but some features seem clear. As a system it rested on foundations established in the early and mid-19th century, among them in particular the development of the machine-tool industry, which, as Nathan Rosenberg (1963) has shown, itself followed a characteristic and revealing pattern of innovation and diffusion of new techniques. Even with the requisite precision machinery, methods of mass production did not transfer directly or easily from one industry to another, and its introduction often took place in stages peculiar to the production process involved (Hounshell 1984). Software production may prove to be the latest variation of the model, or critical history of technology may show how it has not fit.





[12] Elting E. Morison (1974) has pursued this point along slightly different but equally revealing lines.
[13] Lundstrom (1987) has recently chronicled the failure of some companies to make the requisite adjustments.
[14] The obvious citations here are Kraft (1977) and Greenbaum (1979), but both works are concerned more with politics than with computing, and the focus of their political concerns, the "deskilling" of programmers through the imposition of methods of structured programming, has proved ephemeral, as subsequent experience and data show that programmers have made the transition with no significant loss of control over their work; cf. Boehm (1981).
[15] See, for example, Burke (1970): "Thus technological innovation is not the product of society as a whole but emanates rather from certain segments within or outside of it; the men or institutions responsible for the innovation, to be successful, must 'sell' it to the general public; and innovation does have the effect of creating broad social change" (p. 23). Ferguson (1979a) has made a similar observation about selling new technology.

[16] Along these lines, historians of computing would do well to remember that a line of writings on the nature, impact, and even history of computing stretching from Edmund C. Berkeley's (1949) Giant Brains through John Diebold's several volumes to Edward Feigenbaum's and Pamela McCorduck's (1983) The Fifth Generation stems from people with a product to sell, whether management consulting or expert systems.


[17] An effort at international cooperation in establishing a standard programming language, ALGOL, from its inception in 1956 to its final (and, some argued, over-refined) form in 1968, provides a multileveled view of computing in the '60s. While contributing richly to the conceptual development of programming languages, it also has a political history which carries down to the present in differing directions of research, both in computer science and, perhaps most clearly, in software engineering.


[18] One has to wonder about an article on software engineering that envisions progress on an industrial model and uses photographs taken from the Great Depression.

