Software reliability model with optimal selection of failure data
Abstract
When using software reliability models, it is not
necessarily the case that all of the failure data should be used to
estimate model parameters and to predict failures. The reason for
this is that old data may not be as representative of the current
and future failure process as recent data. Therefore, it may be
possible to obtain more accurate predictions of future failures
by excluding or giving lower weight to the earlier failure counts.
Although “data aging” techniques such as moving average and
exponential smoothing are frequently used in other fields, such
as inventory control, we did not find use of data aging in the
various models we surveyed. One model that includes the concept
of selecting a subset of the failure data is the Schneidewind Non-
Homogeneous Poisson Process (NHPP) software reliability model.
In order to use the concept of data aging, there must be a criterion
for determining the optimal value of the starting failure count
interval. We evaluated four criteria for identifying the optimal
starting interval for estimating model parameters. Three of the
criteria are novel. Two of these treat the failure count interval
index as a parameter by substituting model functions for data
vectors and optimizing on functions obtained from maximum
likelihood estimation techniques. The third one uses weighted
least squares to maintain constant variance in the presence of
the decreasing failure rate assumed by the model. The fourth
criterion is the familiar mean square error. Using the Space
Shuttle On-Board software as an example, our research showed
that significantly improved reliability predictions can be obtained
by applying the appropriate criteria to select a subset of the
failure data.
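The mean square error criterion described above can be sketched in code. The following is a minimal illustration only: it fits a simple exponentially decaying failure-count curve to each candidate subset and keeps the starting interval with the lowest MSE. The decay model, the log-scale least-squares fit, and the `min_tail` parameter are assumptions of this sketch, not the Schneidewind NHPP formulation used in the paper.

```python
import math

def mse_optimal_start(counts, min_tail=5):
    """Pick the starting interval s whose retained failure counts
    counts[s:] are best fit, in the mean-square-error sense, by a
    simple exponentially decaying curve.

    Illustrative stand-in for an MSE-based selection criterion;
    `min_tail` is the minimum number of retained intervals and is
    an assumption of this sketch, not a parameter from the paper.
    """
    best_s, best_mse = 0, math.inf
    for s in range(len(counts) - min_tail + 1):
        tail = counts[s:]
        n = len(tail)
        ts = list(range(n))
        # Least-squares line through (t, log(count + 1)), i.e.
        # model the counts as c_t ~ exp(a + b*t) - 1.
        ys = [math.log(c + 1.0) for c in tail]
        t_bar = sum(ts) / n
        y_bar = sum(ys) / n
        b = (sum((t - t_bar) * (y - y_bar) for t, y in zip(ts, ys))
             / sum((t - t_bar) ** 2 for t in ts))
        a = y_bar - b * t_bar
        # Mean square error of the fitted curve on the retained data.
        mse = sum((c - (math.exp(a + b * t) - 1.0)) ** 2
                  for t, c in zip(ts, tail)) / n
        if mse < best_mse:
            best_s, best_mse = s, mse
    return best_s, best_mse
```

In use, one would pass the per-interval failure counts and compare predictions made with the full data against predictions made from `counts[s:]`; the paper's point is that the truncated data set often predicts future failures more accurately.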
Rights
This publication is a work of the U.S. Government as defined in Title 17, United States Code, Section 101. Copyright protection is not available for this work in the United States.
Related items
Showing items related by title, author, creator and subject.
-
Test Reduction in Open Architecture via Dependency Analysis
Berzin, Valdis; Lim, Peter; Kahia, Mohsen Ben (Monterey, California. Naval Postgraduate School, 2011-04-30); NPS-AM-11-C8P10R02-043
In the Verification and Validation (V&V) phase, whenever there is a newer release of a given program, test engineers need to re-conduct all the tests performed on the previous program release, a costly process known as ...
-
Strain rate dependent failure criteria for fibrous composites using multiscale approach
Kwon, Y.W.; Panick, C.J. (Springer, 2020)
Recently, a set of failure criteria based on a multiscale model was developed for fibrous composites. Those criteria used stresses and strains occurring in the fiber and matrix material level. The failure criteria consisted ...
-
EXPLORING THE USE OF HUMAN RELIABILITY AND ACCIDENT INVESTIGATION METHODS TO INFLUENCE DESIGN REQUIREMENTS FOR NAVAL SYSTEMS
Whitehead, Cindy R. (Monterey, CA; Naval Postgraduate School, 2020-09)
This thesis explores whether established methods from human reliability analysis and accident investigation can be applied early in system development to identify the design vulnerabilities that increase risk of system ...