Abstract:
Data analysis is indispensable to every science and engineering endeavor, but it always
plays second fiddle to the subject area. The existing methods of data analysis, whether
probability theory or spectral analysis, were all developed by mathematicians or are based
on their rigorous rules. In pursuit of rigor, we are forced to make idealized assumptions
and live in a pseudo-real, linear, and stationary world. But the world we live in is
neither stationary nor linear. For example, spectral analysis is synonymous with
Fourier-based analysis. As the Fourier spectrum can only give a meaningful interpretation
of linear and stationary processes, its application to data from nonlinear and
nonstationary processes is problematic. And probability distributions can only represent
global properties, which imply homogeneity (or stationarity) in the population. As
scientific research grows increasingly sophisticated, this inadequacy has become glaringly
obvious. The only alternative is to break away from these limitations: we should let the
data speak for themselves, so that the results can reveal the full range of consequences
of nonlinearity and nonstationarity. To do so, we need a new paradigm of data analysis, a
methodology without an a priori basis that can fully accommodate the variations of the
underlying driving mechanisms. Such a method is adaptive data analysis, based on Empirical
Mode Decomposition and Hilbert Spectral Analysis. The result is presented in a
time-frequency-energy representation. In fact, only with an adaptive method can we define
true frequency, which in turn lets us quantify nonstationarity and nonlinearity. Examples
from classic nonlinear systems and recent climate data will be used to illustrate the
power of the new approach.
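For illustration only, and not the speaker's own implementation, the following minimal Python sketch shows the flavor of this adaptive workflow: the signal is decomposed into intrinsic mode functions (here using the third-party PyEMD package, an assumed dependency), and the Hilbert transform of each mode then yields instantaneous amplitude and frequency, i.e. a time-frequency-energy description. The toy chirp signal and sampling rate are invented for the example.

```python
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD  # assumed dependency: pip install EMD-signal

fs = 1000.0                          # sampling rate in Hz (example value)
t = np.arange(0.0, 2.0, 1.0 / fs)
# Toy nonstationary signal: a chirp plus a slow oscillation
signal = np.cos(2 * np.pi * (5 * t + 10 * t**2)) + 0.5 * np.cos(2 * np.pi * 2 * t)

# Empirical Mode Decomposition: data-adaptive, no a priori basis
imfs = EMD().emd(signal, t)

# Hilbert Spectral Analysis: instantaneous amplitude and frequency per IMF
for k, imf in enumerate(imfs):
    analytic = hilbert(imf)                          # analytic signal
    amplitude = np.abs(analytic)                     # instantaneous energy envelope
    phase = np.unwrap(np.angle(analytic))            # instantaneous phase
    inst_freq = np.gradient(phase, t) / (2 * np.pi)  # instantaneous frequency in Hz
    print(f"IMF {k}: mean frequency ~ {inst_freq.mean():.2f} Hz, "
          f"mean amplitude ~ {amplitude.mean():.2f}")
```

Because the decomposition adapts to the data rather than to a fixed basis, the instantaneous frequency obtained this way can vary with time, which is what allows nonstationarity and nonlinearity to be quantified.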
Tea Time: 3:30pm, R707