Shifting identities in computing. (with Matti Tedre). 2017. The computing field has undergone a number of identity shifts since the 1950s. Early on it focused on computing technology and various phenomena arising from its use. Over the years, the field's identity expanded to include programming, computational methods, and information processes both artificial and natural. One of the most profound shifts was in the rise of computational science, where computing became a new way to do science, joining the traditions of theory and experiment. Another profound shift was the rise of computational thinking in many fields.
Cyber Security is harder than building bridges. (with Dorothy Denning). 2016. Protecting the Internet and online computerized systems from attack is a difficult, messy problem. Here's why.
Interview: Peter Denning. 2013. (by Drew Amarosi) Computer scientist and former Association for Computing Machinery president Peter Denning explains in an interview how fundamental security principles compiled by computing innovators were lost with the advent of the PC era. Denning notes that the interconnectivity of the modern environment was missing in the formative years of computing, and that connectivity "just expands the problem" of data protection. Security and protection, he maintains, were recognized from the very beginning as critical issues. By the time the PC revolution began, operating systems had grown quite large, and aspiring PC system pioneers harbored an animosity toward them "because they thought that it resulted in corporations blocking the small guy out of using computers." These pioneers therefore lacked historical knowledge of the security issues that an older generation of mainframe technologists had dealt with, which effectively delayed addressing such issues for decades. Still, Denning notes that these basic issues are being revisited by the process of "resurrecting old knowledge and adapting it to the new world." He also points out that it is the security of the overall network, rather than of any single operating system, that now commands attention. "These issues transcend individual operating systems," he says.
The Information Paradox. 2012. (With Tim Bell) American Scientist, Nov-Dec. Classical information theory has no room for meaning -- but humans persist in assigning meaning. How can we reconcile this difference?
Reflections on a Symposium on Computation. 2012. PJD reflects on the Ubiquity symposium (see next item) and adds two more questions to the unsettled list -- What is information? What is an algorithm?
What is computation? (Ubiquity September 2010) The standard reference model for computation, the Turing machine, is a powerful model of digital computers, and it can simulate every other computation model ever proposed. Yet the Turing machine's information process -- execution sequences of machine configurations -- is not well matched to the natural, interactive, and continuous information processes frequently encountered today. Other models match these information processes more closely and give better predictions of running time and space. PJD organized a symposium of leading thinkers to explore this question.
Operational Analysis of Queueing Networks. 1977. (With Jeff Buzen.) From Measuring, Modelling, and Evaluating Computer Systems, North-Holland, 1977. This classic article demonstrated that the basic equations of queueing network models, leading to the famous product-form solution, could be derived purely from operational assumptions; steady-state Markovian assumptions were not needed. The three operational assumptions were flow balance, one-step behavior, and homogeneity. Homogeneity, the most important of the three, is a form of empirical independence: the flow out of a server is assumed to depend only on that server's queue length and not on any other server's. These assumptions are weaker than the traditional stochastic assumptions. That the product-form solution holds under them explains why the product-form model works so well in real systems, even those with serious violations of the Markovian assumptions.
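The operational approach treats quantities like utilization and throughput as ratios of directly observed counts, so its basic laws hold as identities rather than stochastic claims. A minimal sketch for a single server (the numbers are illustrative, not data from the paper):

```python
# Operational laws at one server, computed from observed counts
# over a measurement interval. All numbers are illustrative.

T = 100.0   # length of observation interval (seconds)
C = 400     # jobs completed during the interval
B = 80.0    # time the server was observed busy (seconds)

X = C / T   # throughput: completions per unit time
S = B / C   # mean service time per completed job
U = B / T   # utilization: fraction of time busy

# The Utilization Law U = X * S holds by construction -- it is an
# algebraic identity on the observed quantities, with no probabilistic
# assumptions needed.
assert abs(U - X * S) < 1e-12
print(X, S, U)
```

Because such laws are identities on measurements, they hold for any observation interval; the operational assumptions (flow balance, one-step behavior, homogeneity) enter only when deriving the stronger product-form results.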
Operational Analysis of Queueing Network Models. 1978. (With Jeff Buzen.) From ACM Computing Surveys, Sept 1978. This classic article laid out a new framework for understanding models of computer systems and enabled the communication of the benefits of those models to a much wider audience.
Operational treatment of queue distributions and mean value analysis. 1980. (With Jeff Buzen.) From Computer Performance, June 1980. Here Operational Analysis is extended to deal with observed queue-length distributions and the arrival theorem that is the basis for Mean Value Analysis. Relationships among the queue-length distributions seen by an arriving job, a completing job, and an outside observer are derived using operational analysis. A simplified derivation of the Sevcik-Mitrani arrival theorem is presented and used as the basis for discussing Reiser and Lavenberg's Mean Value Analysis. Two results are presented: an algorithm for computing queue-length distributions from conditional throughputs in closed, product-form queueing networks; and an operational bound on the error that can arise in certain theorems when homogeneity is violated.
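The arrival theorem is what makes exact Mean Value Analysis tractable: in a closed product-form network, an arriving job sees the mean queue lengths of the same network with one fewer job. A minimal sketch of the resulting recursion for a single job class with load-independent centers (parameter names and example values are illustrative, not taken from the paper):

```python
def mva(demands, N, Z=0.0):
    """Exact Mean Value Analysis for a closed, single-class,
    product-form network of load-independent queueing centers.

    demands -- service demand D_k = V_k * S_k at each center
    N       -- number of jobs circulating in the network
    Z       -- think time at a delay (terminal) center
    Returns (X, R, Q): system throughput, per-center residence
    times, and per-center mean queue lengths at population N.
    """
    K = len(demands)
    Q = [0.0] * K                       # queue lengths at population 0
    for n in range(1, N + 1):
        # Arrival theorem: an arriving job sees the queue lengths
        # the network has with population n - 1.
        R = [D * (1.0 + Q[k]) for k, D in enumerate(demands)]
        X = n / (Z + sum(R))            # Little's law, system-wide
        Q = [X * R[k] for k in range(K)]
    return X, R, Q
```

For example, `mva([0.2, 0.1], 3)` iterates the recursion up to 3 jobs; with Z = 0, Little's law guarantees the queue lengths sum to N, and throughput stays below the bottleneck bound 1/max(D_k).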
Performance Modeling: Experimental Computer Science at its Best. 1981. A PJD President's letter from Communications of ACM, November 1981. Demonstrates how studies of virtual memory and queueing networks, which featured strong interplay of theory, practice, and experiment, led to deep scientific and engineering understanding of these two areas of computing.
Highly Parallel Computation. 1990. Science magazine, November. (With Walter Tichy.) The 1980s brought a rapid expansion in the use of supercomputers in science. These computers had opened up a new way to make scientific discoveries through modeling and simulation. Computation had become a third pillar of science alongside theory and experiment. A new interdisciplinary field called Computational Science had been founded, recognized by the US Congress in its High Performance Computing and Communication Act. This article gave a technical overview of supercomputing with highly parallel architectures and the challenges ahead for computational scientists.
Advanced Operating Systems (with Robert Brown and Walter Tichy). 1984. Far from fading out of the picture, operating systems continue to meet the challenges of new hardware developments. Their success lies in abstraction levels that hide nonessential details from the user. (See next article for an update in 1999.)
Operating Systems: Conceptual History. 1999. Prepared in 1999 for Encyclopedia of Computer Science, edited by Ralston, Reilly, and Hemmendinger. Co-authored with Walter Tichy and James Hunt. An overview of the functional levels of modern operating systems and how they evolved over time since the 1950s. The reference model described here has remained a good model for an OS kernel up until the present day.
Operational Analysis. 2004. In honor of Ken Sevcik on his 60th birthday, June 2004. PJD reminisces about the birth of operational analysis, its initial controversies, and the key role Ken played in settling them.
On the Subject of Objects. 2005. My students seemed to struggle a lot with object programming. After I put together this history of the ideas behind object programming, they understood.
Is Computer Science Science? April 2005. From ACM Communications. Computer Science meets every criterion for being a science, but it has a self-inflicted credibility problem. (Spanish version.)
Computing is a Natural Science. July 2007. From ACM Communications. Information processes and computation continue to be found abundantly in the deep structures of many scientific fields. Computing is not -- in fact, never was -- a science only of the artificial.