Department of Mathematics
Browsing Department of Mathematics by Author "Belbas, Stavros Apostol"
Item: Estimation of the Weibull distribution with applications to tornado climatology (University of Alabama Libraries, 2012)
McClellan, Michael B.; Belbas, Stavros Apostol; University of Alabama Tuscaloosa

Some general properties of the Weibull distribution are discussed. The mathematical development of the distribution is linked to the family of extreme value distributions, and its origins in science are found to be related to survival analysis. Some generalizations of the distribution are noted, and a limited discussion of its numerous applications is undertaken. One such application is the Weibull model of tornado intensity developed by Dotzek, Grieser, and Brooks (2003). In an attempt to improve this model, several methods for estimating the parameters of the Weibull distribution are discussed. Maximum likelihood estimation is found to be the best method of estimation for the two-parameter Weibull distribution with respect to the asymptotic estimator properties discussed. An existing algorithm to locate the maximum likelihood estimator for the three-parameter Weibull distribution is described, and the complexities of the three-parameter case are investigated. It is known that the maximum likelihood estimates for the Weibull distribution display bias for small sample sizes. An equation is analytically derived to estimate this small-sample bias in the two-parameter case, and numerical unbiasing procedures are discussed. Simulated data are analyzed using the methods developed, and the asymptotic properties of the estimates are discussed for the two-parameter case. The estimation procedures are then applied to actual tornado intensity data from the April 25th-28th, 2011 tornado outbreak as well as the historic records for both Alabama and the United States as a whole.
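The two-parameter maximum likelihood step described in this abstract can be sketched with off-the-shelf tools. This is a minimal illustration using scipy, not the dissertation's own code; the tornado-intensity data are not reproduced here, so simulated data with hypothetical parameter values stand in.

```python
# Sketch: two-parameter Weibull MLE on simulated data (hypothetical values,
# not the dissertation's tornado data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated sample from a Weibull distribution with known shape and scale.
true_shape, true_scale = 1.8, 30.0   # illustrative choices
sample = stats.weibull_min.rvs(true_shape, scale=true_scale,
                               size=500, random_state=rng)

# Two-parameter MLE: fix the location at 0 so only shape and scale are free.
shape_hat, loc, scale_hat = stats.weibull_min.fit(sample, floc=0)
print(f"shape: {shape_hat:.3f}  scale: {scale_hat:.3f}")
```

Goodness of fit for such a model can then be checked with a chi-squared test on binned counts, as the abstract describes.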
In all cases, the Weibull model is found to be appropriate as judged by the chi-squared test at 5 percent significance.

Item: Using Bayesian techniques with item response theory to analyze mathematics tests (University of Alabama Libraries, 2013)
Maxwell, Mary; Gleason, Jim; University of Alabama Tuscaloosa

Due to the cost of a college education, final exams for college-level courses fall under the category of "high-stakes" tests. An incorrectly measured assessment may result in students paying thousands of dollars toward retaking the course, scholarships being rescinded, or students taking courses for which they are not prepared, as well as many other undesirable consequences. Therefore, faculty at colleges and universities must understand the reliability of these tests to accurately measure student knowledge. Traditionally, for large general education courses, faculty use common exams and measure their reliability using Classical Test Theory (CTT). However, the cutoff scores are arbitrarily chosen, and little is known about the accuracy of measurement at these critical scores. A solution to this dilemma is to use Item Response Theory (IRT) models to determine the instrument's reliability at various points along the student ability spectrum. Since cost is always on the mind of faculty and administrators at these schools, we compare the use of free software (Item Response Theory Command Language) to generally accepted commercial software (Xcalibre) in the analysis of College Algebra final exams. With both programs, a Bayesian approach was used: Bayes modal estimates were obtained for item parameters, and EAP (expected a posteriori) estimates were obtained for ability parameters. Model-data fit analysis was conducted using two well-known chi-square fit statistics, with no significant difference found in model-data fit.
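The two-parameter logistic (2PL) model and the EAP ability estimates mentioned above can be sketched in a few lines. The item parameters and responses below are hypothetical, not taken from either software package; this is only an illustration of the model form, assuming item parameters are already estimated.

```python
# Sketch: 2PL item response function and an EAP ability estimate under a
# standard normal prior (hypothetical item parameters and responses).
import numpy as np

def p_correct(theta, a, b):
    """2PL item response function: probability of a correct response."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Hypothetical discrimination (a) and difficulty (b) parameters for 5 items.
a = np.array([1.2, 0.8, 1.5, 1.0, 2.0])
b = np.array([-1.0, 0.0, 0.5, 1.0, 1.5])
responses = np.array([1, 1, 1, 0, 0])   # one examinee's scored answers

# EAP: posterior mean of theta on a quadrature grid, standard normal prior.
grid = np.linspace(-4, 4, 161)
prior = np.exp(-grid**2 / 2)
P = p_correct(grid[:, None], a, b)      # grid points x items
likelihood = np.prod(np.where(responses == 1, P, 1 - P), axis=1)
posterior = prior * likelihood
theta_eap = np.sum(grid * posterior) / np.sum(posterior)
print(f"EAP ability estimate: {theta_eap:.3f}")
```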
Parameter estimates were compared directly, along with a comparison of item response functions using a weighted version of the root mean square error (RMSE) that factors in the ability distribution of examinees; the item response functions from the two programs were found to be comparable. Furthermore, ability estimates from both programs were found to be nearly identical. Thus, when the assumptions of IRT are met for the two- and three-parameter logistic and the generalized partial credit models, the freely available software program is an appropriate choice for the analysis of College Algebra final exams.

Item: Volatility analysis for high frequency financial data (University of Alabama Libraries, 2009)
Zheng, Xiaohua; Wu, Zhijian; University of Alabama Tuscaloosa

Measuring and modeling financial volatility are key steps for derivative pricing and risk management. In financial markets, there are two kinds of data: low-frequency financial data and high-frequency financial data. Most research has been done based on low-frequency data. In this dissertation we focus on high-frequency data. In theory, the sum of squares of log returns sampled at high frequency estimates their variance. For log price data following a diffusion process without noise, the realized volatility converges to its quadratic variation. When log price data contain market microstructure noise, the realized volatility explodes as the sampling interval converges to 0. In this dissertation, we generalize the fundamental Itô isometry and analyze the speed with which stochastic processes approach their quadratic variations. We determine the difference between realized volatility and quadratic variation under mean square constraints, both for Brownian motion and in the general case. We improve the estimation of quadratic variation: our estimators converge to it at a higher rate.
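The two regimes this last abstract describes can be sketched with a simulated log-price path: without noise, the realized volatility (sum of squared log returns) approximates the quadratic variation, here sigma^2 * T; with i.i.d. microstructure noise added, it grows as the sampling interval shrinks. All parameter values are illustrative, not from the dissertation.

```python
# Sketch: realized volatility of a simulated Brownian log price, with and
# without microstructure noise (illustrative parameters only).
import numpy as np

rng = np.random.default_rng(0)
sigma, T, n = 0.2, 1.0, 100_000        # volatility, horizon, fine grid size
dt = T / n

# Efficient (noise-free) log price: Brownian motion with volatility sigma.
log_price = np.cumsum(sigma * np.sqrt(dt) * rng.standard_normal(n))

def realized_vol(x, step):
    """Sum of squared log returns sampled every `step` grid points."""
    sampled = x[::step]
    return np.sum(np.diff(sampled) ** 2)

rv_clean = realized_vol(log_price, step=10)
print(f"RV (no noise): {rv_clean:.4f}  vs sigma^2*T = {sigma**2 * T:.4f}")

# Add i.i.d. microstructure noise: RV now grows as sampling gets finer,
# since each squared increment picks up the noise variance.
noisy = log_price + 0.001 * rng.standard_normal(n)
for step in (100, 10, 1):
    print(f"step={step:3d}  RV(noisy) = {realized_vol(noisy, step):.4f}")
```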