Rethinking Randomness: An Interview with Jeff Buzen, Part I
Authors
Denning, Peter J.
Buzen, Jeff
Date of Issue
2016-08
Publisher
Association for Computing Machinery (ACM)
Abstract
Editor's Introduction: For more than 40 years, Jeffrey Buzen has been a leader in performance prediction of computer systems and networks. His first major contribution was an algorithm, known now as Buzen's Algorithm, that calculated the throughput and response time of any practical network of servers in a few seconds. Prior algorithms were impractical because they would have taken months or years for the same calculations. Buzen's breakthrough opened a new industry of companies providing performance evaluation services, and laid scientific foundations for designing systems that meet performance objectives. Along the way, he became troubled by the fact that the real systems he was evaluating seriously violated his model's assumptions, and yet the faulty models predicted throughput to within 5 percent of the true value and response time to within 25 percent. He began puzzling over this anomaly and invented a new framework for building computer performance models, which he called operational analysis. Operational analysis produced the same formulas, but with assumptions that hold in most systems. As he continued to probe this puzzle, he formulated a more complete theory of randomness, which he calls observational stochastics, and he wrote a book, Rethinking Randomness, laying out his new theory. We talked with Jeff Buzen about his work. (Peter J. Denning, Editor in Chief)
Type
Article
Department
Computer Science (CS)
Citation
Denning, Peter J. "Rethinking Randomness: An Interview with Jeff Buzen, Part I." Ubiquity 2016, no. August (2016): 1.
