Big data and reliability applications: The complexity dimension
Date
2018
Authors
Hong, Yili
Zhang, Man
Meeker, William Q.
Journal Title
Journal of Quality Technology
Journal ISSN
0022-4065
Publisher
Taylor & Francis
Abstract
Big data features not only large volumes of data but also data with complicated structures. Complexity imposes unique challenges on big data analytics. Meeker and Hong (2014; Quality Engineering, pp. 102–116) provided an extensive discussion of the opportunities and challenges of big data in reliability; they also described engineering systems that generate big data usable in reliability analysis. Meeker and Hong (2014) focused on large-scale system operating and environment data (i.e., high-frequency multivariate time series data) and gave examples of how to link such data, as covariates, to traditional reliability responses such as time to failure, time to recurrence of events, and degradation measurements. This article extends that discussion by focusing on how to use data with complicated structures in reliability analysis. Such data types include high-dimensional sensor data, functional curve data, and image streams. We first review recent developments in these directions and then discuss how analytical methods can be developed to tackle the challenges that arise from the complex features of big data in reliability applications. We also discuss the use of modern statistical methods such as variable selection, functional data analysis, scalar-on-image regression, spatio-temporal data models, and machine-learning techniques.
Journal Issue
2
Type
Article
Comments
This is an Accepted Manuscript of an article published by Taylor & Francis as Hong, Yili, Man Zhang, and William Q. Meeker. "Big data and reliability applications: The complexity dimension." Journal of Quality Technology 50, no. 2 (2018): 135–149. DOI: 10.1080/00224065.2018.1438007. Posted with permission.