Abstract: | In this thesis we generalize the Hansen and Seo test in the R package tsDyn, which tests a linear cointegration model against a two-regime threshold cointegration model, to the case of three regimes in the alternative hypothesis. As the Lagrange Multiplier (LM) test statistic used in the Hansen and Seo test in tsDyn differs from the LM statistic described in Hansen and Seo (2002), we generalize both statistics and show that they are equal under certain conditions. The grid search algorithm, which is necessary when maximizing this LM statistic, is also extended to the case of three regimes, and it is rewritten such that, when the cointegration value is given, it truly maximizes the LM statistic under the constraints specified by the user. In our empirical studies we have thoroughly examined the bivariate time series consisting of the monthly NIBOR rates of the maturities tomorrow-next and 12 months. When modeling this bivariate time series, we find strong evidence that a two-regime TVECM is superior to a linear VECM, and in our out-of-sample forecasting the two-regime SETAR model predicts the cointegration relation much better than an AR model. When testing a two-regime SETAR model for the cointegration relation against a three-regime model, the two-regime model cannot be rejected at any reasonable significance level. In addition, we show how influential a few outliers may be by removing them from the time series and rerunning some of the statistical tests. We have also tested all 66 possible pairs of Norwegian interest rates for cointegration, and we have tested the term spread of each pair for threshold effects, i.e., testing a linear model against a two-regime model, as well as a two-regime model against a three-regime model.
We find many cointegrated pairs, with evidence for a two-regime model in approximately 50% of the cases and evidence for a three-regime model in some cases in this univariate time series analysis. Finally, we simulate a bivariate time series with a three-regime threshold cointegration model as the data-generating process, and estimate a three-regime threshold cointegration model from this simulated time series. We thereby illustrate that the thresholds that our version of the Hansen and Seo test detects as optimal are close to the original thresholds used in the simulation. As expected, a linear model for this bivariate time series is strongly rejected, and there is strong evidence that a three-regime threshold model for the cointegration relation is superior to both a linear model and a two-regime threshold model. |
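The constrained grid search over threshold pairs described in the abstract can be sketched generically. The Python sketch below is illustrative only: `lm_stat` is a placeholder for the (unspecified) LM statistic of the thesis, and the trimming fraction and grid size are assumed parameters, not values taken from the thesis.

```python
import itertools
import numpy as np

def grid_search_two_thresholds(w, lm_stat, trim=0.1, n_grid=30):
    """Maximize a statistic over ordered threshold pairs (g1 < g2).

    w       : values of the error-correction (cointegration) term
    lm_stat : callable mapping (g1, g2) to a test statistic
    trim    : minimum fraction of observations required in each regime
    """
    # candidate thresholds taken from quantiles of w, avoiding the tails
    candidates = np.quantile(w, np.linspace(trim, 1 - trim, n_grid))
    best, best_pair = -np.inf, None
    for g1, g2 in itertools.combinations(candidates, 2):  # enforces g1 < g2
        lo = np.mean(w <= g1)          # share of points in the lower regime
        hi = np.mean(w > g2)           # share of points in the upper regime
        mid = 1.0 - lo - hi            # share of points in the middle regime
        if min(lo, mid, hi) < trim:    # skip pairs violating the trimming
            continue
        stat = lm_stat(g1, g2)
        if stat > best:
            best, best_pair = stat, (g1, g2)
    return best, best_pair
```

The trimming constraint mirrors the usual requirement that each regime contain enough observations for the regime-specific parameters to be estimable.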
URI: | http://hdl.handle.net/10037/4370 |
Abstract: | This work concentrates on using the ICM algorithm for image restoration. The algorithm has been applied to fMRI recordings and images of moles. |
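As a rough illustration of the restoration technique the abstract names, here is a minimal ICM (iterated conditional modes) sketch for a binary image with an Ising-style prior; the weights `beta` and `h` and the 4-neighbourhood are assumptions for demonstration, not details taken from the thesis.

```python
import numpy as np

def icm_denoise(noisy, beta=1.0, h=1.0, n_iter=5):
    """ICM restoration of a binary (+1/-1) image with an Ising prior.

    Each pixel is set, in turn, to the label that maximizes its
    conditional probability given its 4-neighbourhood (weight beta)
    and the originally observed value (weight h).
    """
    x = noisy.copy()
    rows, cols = x.shape
    for _ in range(n_iter):
        for i in range(rows):
            for j in range(cols):
                # sum of the 4-neighbour labels (zero outside the border)
                s = 0
                if i > 0:        s += x[i - 1, j]
                if i < rows - 1: s += x[i + 1, j]
                if j > 0:        s += x[i, j - 1]
                if j < cols - 1: s += x[i, j + 1]
                # pick the label favoured by neighbours and data
                x[i, j] = 1 if beta * s + h * noisy[i, j] > 0 else -1
    return x
```

Isolated flipped pixels are outvoted by their neighbours and restored, which is the behaviour that makes ICM useful for removing salt-and-pepper-style noise.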
URI: | http://hdl.handle.net/10037/2071 |
Abstract: | All points on the surface of the Earth are moving. To define the velocity of a given point, we can place a GPS receiver there and measure the coordinates every day. After collecting enough data, we can generate a time series of three coordinates: the North, East and Height directions. The most common technique for determining such displacements is the linear model. The main objective of this thesis is to show how to estimate the velocity of a given point, using statistical methods to improve the results. The site-velocity estimate is improved by excluding all signals that are not of tectonic origin (seasonal variations, spatially correlated noise). The time series for all directions contain gaps (missing data), outliers and offsets, and vary in length. The data discontinuities are detected and corrected by a simple algorithm, based on binary search, that locates the time of abruption. The outliers are eliminated using robust estimation techniques. Simulation is used to fill the gaps. The data obtained from permanent GPS stations in Norway and some other European countries are unevenly sampled. We therefore use the Lomb-Scargle method to perform spectral analysis. This allows us to detect annual and interannual variations. The methods of Principal Components (also known as Empirical Orthogonal Functions, or EOF) and Factor Analysis are used to correct for common fluctuations. We use data from 8 permanent GPS stations (SATREF) in these investigations. |
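The Lomb-Scargle analysis of unevenly sampled series mentioned in the abstract can be illustrated with SciPy's implementation; the synthetic annual signal and the period grid below are assumptions for demonstration, not data from the thesis.

```python
import numpy as np
from scipy.signal import lombscargle

def dominant_period(t, y, periods):
    """Return the candidate period (same units as t) with the largest
    Lomb-Scargle power for an unevenly sampled series."""
    y = y - y.mean()                        # lombscargle assumes zero mean
    ang_freqs = 2 * np.pi / np.asarray(periods)  # scipy uses angular frequency
    power = lombscargle(t, y, ang_freqs)
    return periods[np.argmax(power)]
```

For a GPS coordinate series sampled on irregular days, evaluating the periodogram on a grid of trial periods around one year would reveal the seasonal variation even though an ordinary FFT cannot be applied to the gapped record.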
URI: | http://hdl.handle.net/10037/1963 |
Abstract: | During the last decades, the incidence rate of cutaneous malignant melanoma, a type of skin cancer developing from melanocytic skin lesions, has risen to alarmingly high levels. As there is no effective treatment for advanced melanoma, recognizing the lesion at an early stage is crucial for successful treatment. A trained expert dermatologist has an accuracy of around 75% when diagnosing melanoma; for a general physician the number is much lower. Dermoscopy (dermatoscopy, epiluminescence microscopy (ELM)) has a positive effect on the diagnostic accuracy, but only when used by trained personnel. The dermoscope is a device consisting of a magnifying glass and polarized light, making the upper layer of the skin translucent. The need for computer-aided diagnosis of skin lesions is obvious and urgent. With both digital compact cameras and pocket dermoscopes that meet the technical demands for precise capture of colors and patterns in the upper skin layers, the challenge is to develop fast, precise and robust algorithms for the diagnosis of skin lesions. Any unsupervised diagnosis of skin lesions necessarily starts with unsupervised segmentation of lesion and skin. This master's thesis proposes an algorithm for unsupervised skin-lesion segmentation and the necessary pre-processing. Starting with a digital dermoscopic image of a lesion surrounded by healthy skin, the pre-processing steps are noise filtering, illumination correction and removal of artifacts. A median filter is used for noise removal because of its edge-preserving capabilities and computational efficiency. When the dermoscope is put in contact with the patient's skin, the angle between the skin and the magnifying glass affects the distribution of the light emitted from the diodes attached to the dermoscope. Scalar multiplication with an illumination correction matrix, individually adapted to each image, facilitates the analysis of the image, especially for skin lesions of light color.
Artifacts such as scales printed on the glass of the dermoscope, hairs and felt-pen marks on the patient's skin are all obstacles to correct segmentation. This thesis proposes a new, robust and computationally efficient algorithm for hair removal, based on morphological operations on binary images. The segmentation algorithm is based on global thresholding and histogram analysis. Unlike most segmentation algorithms based on histogram analysis, the algorithm proposed in this thesis makes no assumptions about the characteristics of the lesion mode. From the truecolor RGB image, the first principal component is used as the grayscale image. The algorithm searches for the peak of the skin mode and the skin mode's left bound. The pixel values belonging to the bins to the left of the bound are regarded as samples from an underlying distribution, and the expected value of this distribution is estimated. The value of the pixels in the bin located equidistant from the expected value of the lesion mode and the skin-mode peak is used as the threshold value. After global thresholding, post-processing is applied to identify the lesion object. The only parameters in this algorithm are the number of bins in the histogram and the shape of the local minimum regarded as the skin-mode bound. The dermoscopic images have been divided into two independent sets: a training set and a test set. The training set consists of 68 images, and the test set consists of 156 images. 80 of the images from the test set have been evaluated by expert dermatologists through visual inspection. |
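The histogram-based global thresholding described in the abstract can be sketched as follows. This is a simplified illustration under stated assumptions (the skin mode is the brightest and most populous mode, and the left bound is taken at the first local minimum left of the skin peak), not the thesis's exact algorithm.

```python
import numpy as np

def lesion_threshold(gray, n_bins=64):
    """Segment a grayscale lesion image by global thresholding.

    Locate the skin-mode peak, walk left to the skin mode's left bound,
    estimate the expected value of the lesion mode from the pixels left
    of that bound, and threshold midway between the lesion mean and the
    skin-mode peak. Returns a boolean lesion mask."""
    counts, edges = np.histogram(gray.ravel(), bins=n_bins)
    centers = (edges[:-1] + edges[1:]) / 2
    skin_peak = np.argmax(counts)        # assumed: skin is the dominant mode
    # walk left from the skin peak while the histogram is non-increasing;
    # stop at the first local minimum, taken as the skin mode's left bound
    bound = skin_peak
    while bound > 0 and counts[bound - 1] <= counts[bound]:
        bound -= 1
    # pixels left of the bound are treated as samples from the lesion mode
    lesion_pixels = gray.ravel()[gray.ravel() < centers[bound]]
    lesion_mean = lesion_pixels.mean()   # expected value of the lesion mode
    threshold = (lesion_mean + centers[skin_peak]) / 2
    return gray < threshold              # True = lesion-candidate pixels
```

As in the thesis's description, the only tuning parameter here is the number of histogram bins; everything else is derived from the image itself.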
URI: | http://hdl.handle.net/10037/1707 |