This paper proposes a new method to trace transmission losses in a deregulated power system by applying a Genetic Algorithm (GA) and a Least Squares Support Vector Machine (LS-SVM). The idea is to use the GA as an optimizer to find optimal values of the LS-SVM hyper-parameters and to adopt a supervised learning approach to train the LS-SVM model. The well-known proportional sharing method (PSM) is used to trace the loss on each transmission line, and its output serves as the teacher signal in the proposed hybrid technique, called the GA-SVM method. With load profiles as inputs and the PSM transmission-loss allocation as the target, the GA-SVM model is expected to learn which generators are responsible for transmission losses. The IEEE 14-bus system is used to show the effectiveness of the proposed method ...
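A minimal, hedged sketch of the GA-plus-LS-SVM idea follows: the LS-SVM regressor is solved directly from its dual linear system, and a small real-coded GA searches the regularization and kernel-width hyper-parameters. The data, GA settings, and search bounds are illustrative placeholders, not the paper's IEEE 14-bus load profiles or PSM targets.

```python
# Hedged sketch: GA-tuned LS-SVM regression (toy data; the paper's inputs would
# be load profiles and PSM loss allocations, which are not available here).
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(A, B, sigma):
    """RBF (Gaussian) kernel matrix between row-sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma, sigma):
    """Solve the LS-SVM regression dual linear system for (alpha, b)."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]                       # alpha, b

def lssvm_predict(X_train, alpha, b, sigma, X_new):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

def fitness(params, X_tr, y_tr, X_va, y_va):
    gamma, sigma = params
    alpha, b = lssvm_fit(X_tr, y_tr, gamma, sigma)
    pred = lssvm_predict(X_tr, alpha, b, sigma, X_va)
    return -np.mean((pred - y_va) ** 2)          # GA maximises fitness

def ga_optimize(X_tr, y_tr, X_va, y_va, pop=20, gens=30):
    """Very small real-coded GA over log10(gamma) and log10(sigma) (assumed bounds)."""
    P = rng.uniform([-2, -2], [4, 2], size=(pop, 2))
    for _ in range(gens):
        fit = np.array([fitness(10 ** p, X_tr, y_tr, X_va, y_va) for p in P])
        parents = P[np.argsort(fit)[::-1][: pop // 2]]        # truncation selection
        children = parents[rng.integers(0, len(parents), pop - len(parents))]
        children = children + rng.normal(0, 0.2, children.shape)  # mutation
        P = np.vstack([parents, children])
    best = P[np.argmax([fitness(10 ** p, X_tr, y_tr, X_va, y_va) for p in P])]
    return 10 ** best                            # (gamma, sigma)

# Toy stand-in data (inputs -> line loss), purely illustrative.
X = rng.uniform(0, 1, (120, 4))
y = np.sin(X @ np.array([1.0, 2.0, 0.5, 1.5])) + 0.05 * rng.normal(size=120)
X_tr, y_tr, X_va, y_va = X[:80], y[:80], X[80:], y[80:]
gamma, sigma = ga_optimize(X_tr, y_tr, X_va, y_va)
alpha, b = lssvm_fit(X_tr, y_tr, gamma, sigma)
print("GA-selected gamma, sigma:", gamma, sigma)
```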
Background: The objective of the present study was to test the ability of the partial least squares regression technique to impute genotypes from low-density single nucleotide polymorphism (SNP) panels, i.e. 3K or 7K, to a high-density 50K SNP panel. No pedigree information was used. Methods: Data consisted of 2093 Holstein, 749 Brown Swiss and 479 Simmental bulls genotyped with the Illumina 50K BeadChip. First, a single-breed approach was applied using only data from Holstein animals. Then, to enlarge the training population, data from the three breeds were combined and a multi-breed analysis was performed. Accuracies of genotypes imputed using the partial least squares regression method were compared with those obtained using the Beagle software. The impact of genotype imputation on breeding value prediction was evaluated for milk yield, fat content and protein content. Results: In the single-breed approach, the accuracy of imputation using partial least squares regression was around 90 and 94% for ...
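As a hedged illustration of the workflow (not the authors' implementation), a PLS regression can be fitted to map low-density genotypes onto high-density ones, with the continuous predictions rounded back to allele counts. The data sizes and component count below are invented placeholders.

```python
# Hedged sketch of PLS-based genotype imputation on simulated allele counts
# coded 0/1/2; real data would come from the 50K chip and panel subsets.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
n_train, n_test, n_low, n_high = 400, 50, 300, 1000   # illustrative sizes

X_train = rng.integers(0, 3, (n_train, n_low)).astype(float)   # low-density panel
Y_train = rng.integers(0, 3, (n_train, n_high)).astype(float)  # high-density panel
X_test = rng.integers(0, 3, (n_test, n_low)).astype(float)

pls = PLSRegression(n_components=20)     # component count would be tuned by CV
pls.fit(X_train, Y_train)

Y_pred = pls.predict(X_test)
Y_imputed = np.clip(np.rint(Y_pred), 0, 2)   # round back to valid genotype codes
print(Y_imputed.shape)                        # (50, 1000)
```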
Objective: Stroke is one of the most important life-threatening ailments across the world. The current study was performed to classify the outcome of stroke.
Provides (weighted) partial least squares regression for generalized linear models and repeated k-fold cross-validation of such models using various criteria. It allows for missing data in the explanatory variables. Construction of bootstrap confidence intervals is also available.
In this reaction, the hydrogen atom of a cellulose hydroxyl group is replaced by an acetyl group from acetic anhydride, yielding a carboxylic acid and an ester group in the wood, the ester being less polar than the original hydroxyl. This strategy renders wood more compatible with apolar polymers such as polyolefins. Hence, the objectives of this study were to verify the effects of the acetic acid/anhydride ratio on the acetylation of wood flour at varying times and temperatures and to examine hydroxyl and carbonyl groups by FTIR. The influence of these parameters was verified by experiments with a factorial design and partial least squares regression, both useful techniques for obtaining information about the effect of parameters in a given process [25-28]. EXPERIMENTAL. Materials. Wood flour (100 mesh) from Pinus sp. trees, with a density of 0.25 g/cm³, was supplied by Pinhopó Ltda. Acetic acid and acetic anhydride (both from Biotec) were used in different ratios in the acetylation ...
Destache, Christopher J.; Hilleman, Daniel E.; Mohiuddin, Syed M.; Lang, Patricia T. (1992). Predictive performance of Bayesian and nonlinear least-squares regression programs for lidocaine. The predictive performance of two computer programs for lidocaine dosing was evaluated. Two-compartment Bayesian and nonlinear least-squares regression programs were used in two groups of patients (15 acute arrhythmia patients and 14 chronic arrhythmia patients). Lidocaine was given as a 1.5 mg/kg bolus and a 2.8 mg/min infusion for 48 h. A second bolus (0.5 mg/kg) was given over 2 min, 10 min after the first bolus. Serum samples from the patients receiving lidocaine were drawn at 2, 15 and 30 min and 1, 2 and 4 h and were used to forecast the serum concentrations at 6, 8, 12 and 48 h. Predictive performance was assessed by mean error and mean-squared error. The results (mean ± 95% confidence intervals) demonstrated that the Bayesian program predicted a ...
Methods and Results: In 86 stable patients with HF and EF ≥45% in the Karolinska Rennes (KaRen) biomarker substudy, biomarkers were quantified by a multiplex immunoassay. Orthogonal projection to latent structures by partial least squares analysis was performed on 87 biomarkers and 240 clinical variables, ranking biomarkers associated with New York Heart Association (NYHA) functional class and the composite outcome (all-cause mortality and HF hospitalization). Biomarkers significantly correlated with outcome were analyzed by multivariable Cox regression, and correlations with echocardiographic measurements were performed. The orthogonal partial least squares outcome-predicting biomarker pattern was run against the Ingenuity Pathway Analysis (IPA) database, which contains annotated data from the public domain. The orthogonal partial least squares analyses identified 32 biomarkers correlated with NYHA class and 28 predicting outcomes. Among the outcome-predicting biomarkers, growth/differentiation factor-15 was ...
Leaf water content is one of the most common physiological parameters limiting the efficiency of photosynthesis and biomass productivity in plants, including Miscanthus. Therefore, it is of great significance to determine or predict water content quickly and nondestructively. In this study, we explored the relationship between leaf water content and diffuse reflectance spectra in Miscanthus. Three multivariate calibrations, including partial least squares (PLS), least squares support vector machine regression (LSSVR), and radial basis function (RBF) neural network (NN), were developed as models for leaf water content determination. The non-linear models, RBF_LSSVR and RBF_NN, showed higher accuracy than the PLS and Lin_LSSVR models. Moreover, 75 sensitive wavelengths were identified as closely associated with leaf water content in Miscanthus. The RBF_LSSVR and RBF_NN models for predicting leaf water content, based on the 75 characteristic wavelengths, obtained a high determination ...
[Figure caption] Partial least squares model based on low-mass-range (2.5-20 kDa) data from the supernatant fraction of three normal tissue (grey) and three tumour tissue samples ...
Lamb waves have been reported to be an efficient tool for non-destructive evaluation (NDE) in various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanisms of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least squares support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage-sensitive features, namely normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes in Lamb wave characteristics caused by damage. In line with commonly used data-driven methods, the GA-based LS-SVM model using the three proposed damage-sensitive features was implemented to evaluate the crack size. The GA was adopted to optimize the model parameters. The results of the GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed ...
Vibroarthrographic (VAG) signals emitted from the knee joint provide an early diagnostic tool for knee joint disorders. The nonstationary and nonlinear nature of the VAG signal makes feature extraction an important aspect. In this work, we investigate VAG signals using a proposed wavelet-based decomposition. The VAG signals are decomposed into sub-band signals of different frequencies. Nonlinear features such as recurrence quantification analysis (RQA), approximate entropy (ApEn) and sample entropy (SampEn) are extracted as features of the VAG signal. A total of twenty-four features form a vector characterizing a VAG signal. Two feature selection (FS) techniques, the apriori algorithm and a genetic algorithm (GA), select six and four features, respectively, as the most significant. Least squares support vector machines (LS-SVM) and random forest are proposed as classifiers to evaluate the performance of the FS techniques. Results indicate that classification accuracy was more prominent with features selected by the FS algorithms. ...
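As a hedged illustration of one of the listed features, a minimal sample entropy (SampEn) implementation is sketched below; the embedding dimension m and tolerance r are the common defaults, not values taken from this study.

```python
# Hedged sketch: sample entropy, one of the nonlinear features extracted from
# the wavelet sub-band signals (synthetic input shown for demonstration).
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r) with Chebyshev distance, self-matches excluded."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    if r is None:
        r = 0.2 * np.std(x)
    def count_matches(mm):
        # N - m overlapping templates of length mm (same count for mm = m and m + 1)
        templates = np.array([x[i:i + mm] for i in range(N - m)])
        count = 0
        for i in range(len(templates) - 1):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d <= r)
        return count
    B = count_matches(m)
    A = count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(0)
print(sample_entropy(rng.normal(size=500)))   # white noise gives a relatively high SampEn
```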
Skinning injury on potato tubers is a superficial wound generally inflicted by mechanical forces during harvest and postharvest handling operations. Although skinning injury is pervasive and obstructive, its detection is very limited. This study attempted to identify injured skin using two CCD (Charge-Coupled Device) sensor-based machine vision technologies, i.e., visible imaging and biospeckle imaging. Skinning injury was identified by exploiting features extracted from various ROIs (Regions of Interest). The features extracted from visible images were pixel-wise color and texture features, while region-wise BA (Biospeckle Activity) was calculated from biospeckle imaging. In addition, BA calculations using varied numbers of speckle patterns were compared. Finally, the extracted features were fed into LS-SVM (Least Squares Support Vector Machine) and BLR (Binary Logistic Regression) classifiers, respectively. Results showed that color features performed ...
Lymph node status is not part of the staging system for cervical cancer, but it provides important information for prognosis and treatment. We investigated whether lymph node status can be predicted with proteomic profiling. Serum samples from 60 cervical cancer patients (FIGO I/II) were obtained before primary treatment. Samples were run through an HPLC depletion column, eliminating the 14 most abundant proteins ubiquitously present in serum. Unbound fractions were concentrated with spin filters. Fractions were spotted onto CM10 and IMAC30 surfaces and analyzed with surface-enhanced laser desorption/ionization time-of-flight (SELDI-TOF) mass spectrometry (MS). Unsupervised peak detection and peak clustering were performed using MASDA software. Leave-one-out (LOO) validation for weighted Least Squares Support Vector Machines (LSSVM) was used for prediction of lymph node involvement. Other outcomes were histological type, lymphovascular space involvement (LVSI) and recurrent disease. LSSVM models were able to determine LN ...
Monotone functions, such as growth functions and cumulative distribution functions, are often of interest in the statistical literature. In this dissertation, we propose a nonparametric least-squares method for estimating monotone functions induced from stochastic processes in which the starting time of the process is subject to interval censoring. We apply this method to estimate the mean function of tumor growth, with data from either animal experiments or tumor screening programs, to investigate tumor progression. In this type of application, the tumor onset time is observed only within an interval. The proposed method can also be used to estimate the cumulative distribution function of the elapsed time between two related events in human immunodeficiency virus (HIV)/acquired immunodeficiency syndrome (AIDS) studies, such as the HIV transmission time between two partners and the AIDS incubation time from HIV infection to AIDS onset. In these applications, both the initial event and the subsequent event are only ...
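The dissertation's estimator handles interval-censored starting times; as a simpler, hedged illustration of nonparametric least-squares estimation of a monotone function (without censoring), plain isotonic regression can be used.

```python
# Hedged sketch: least-squares monotone fit via isotonic regression on simulated
# exact observations (no interval censoring, unlike the dissertation's setting).
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 10, 200))                 # observation times
growth = np.log1p(t) + rng.normal(0, 0.15, t.size)   # noisy monotone signal

iso = IsotonicRegression(increasing=True)
fitted = iso.fit_transform(t, growth)                # least-squares monotone fit
print(fitted[:5])
```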
It is usually more convenient to base programs for nonlinear regression on matrix algebra. This is the approach taken in higher-level mathematical programming software such as Matlab and Mathcad (sources for the Matlab and Mathcad software are listed at the end of this chapter). The principles are exactly the same as in the algebraic approach discussed above, but matrix methods facilitate organization and manipulation of the data. In matrix notation, the straight-line model can be expressed as Y = Xb + e [3, 5], where Y is a vector containing the n measured values y_i(meas), X is an n × 2 sample matrix, e is a vector containing the observed residuals, and b is the vector containing the values of the slope and intercept. For an example with n = 3, eq. (2.17) can be represented as in Box 2.1. ...
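A minimal sketch of this matrix formulation for n = 3, with illustrative numbers standing in for Box 2.1:

```python
# Hedged sketch: straight-line least squares in matrix form, Y = Xb + e.
import numpy as np

x = np.array([1.0, 2.0, 3.0])               # independent variable
Y = np.array([2.1, 3.9, 6.2])               # measured values y_i(meas)
X = np.column_stack([x, np.ones_like(x)])   # n x 2 sample matrix: [x, 1]

# Normal equations give b = (X^T X)^{-1} X^T Y; lstsq is the numerically safer route.
b, *_ = np.linalg.lstsq(X, Y, rcond=None)
e = Y - X @ b                               # observed residuals
print("slope, intercept:", b)
print("residuals:", e)
```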
Currently, many studies have focused on the magnitude of coherence with less emphasis on the time delay, or have mostly used only one method to establish the temporal relationship between the sensorimotor cortex and the peripheral muscles. Here, time delays obtained using inverse fast Fourier transformation (IFFT), least squares regression analysis (LSR), weighted least squares regression analysis (WLSR), maximum coherence (MAX-COH) and mean of significant coherences (MEAN-COH) methods in the same subjects are compared to clarify the best method(s) for electroencephalography (EEG)-electromyography (EMG) temporal analysis. EEG activity and surface EMG activity from the first dorsal interosseous (FDI) muscle of the right hand were recorded in eight normal subjects during a weak contraction task. The current source density (CSD) reference was estimated and used in the phase and temporal analysis. For the EEG-EMG time delay in the same subjects, the MAX-COH, MEAN-COH and LSR methods are found to ...
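A hedged sketch of the LSR idea: estimate a delay from the least-squares slope of the cross-spectral phase over a coherent band. The synthetic signals and band limits below are placeholders, not the study's EEG-EMG recordings.

```python
# Hedged sketch: phase-slope (least-squares regression) delay estimation.
import numpy as np
from scipy.signal import csd

fs, delay_s = 1000, 0.015                     # sampling rate (Hz), true 15 ms delay
rng = np.random.default_rng(0)
x = rng.normal(size=20000)
y = np.roll(x, int(delay_s * fs)) + 0.5 * rng.normal(size=x.size)  # delayed, noisy copy

f, Pxy = csd(x, y, fs=fs, nperseg=1024)
band = (f > 5) & (f < 45)                     # assumed band of significant coherence
phase = np.unwrap(np.angle(Pxy[band]))
slope = np.polyfit(f[band], phase, 1)[0]      # least-squares line through phase(f)
tau = slope / (2 * np.pi)                     # seconds; the sign indicates which signal leads
print("estimated delay (ms):", abs(tau) * 1000)
```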
Preface xiii
Acknowledgments xv
Abbreviations xvii
1 Identification 1
1.1 Introduction 1
1.2 Illustration of Some Important Aspects of System Identification 2
Exercise 1.a (Least squares estimation of the value of a resistor) 2
Exercise 1.b (Analysis of the standard deviation) 3
Exercise 2 (Study of the asymptotic distribution of an estimate) 5
Exercise 3 (Impact of noise on the regressor (input) measurements) 6
Exercise 4 (Importance of the choice of the independent variable or input) 7
Exercise 5.a (Combining measurements with a varying SNR: weighted least squares estimation) 8
Exercise 5.b (Weighted least squares estimation: a study of the variance) 9
Exercise 6 (Least squares estimation of models that are linear in the parameters) 11
Exercise 7 (Characterizing a 2-dimensional parameter estimate) 12
1.3 Maximum Likelihood Estimation for Gaussian and Laplace Distributed Noise 14
Exercise 8 (Dependence of the optimal cost function on the distribution of the disturbing noise) ...
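In the spirit of Exercise 5.a above (combining measurements with a varying SNR), a hedged weighted least squares sketch with invented values:

```python
# Hedged sketch: weighted least squares estimate of a resistance from
# current/voltage measurements whose noise level (SNR) varies per sample.
import numpy as np

rng = np.random.default_rng(0)
R_true = 1000.0                               # ohms (invented)
i = rng.uniform(1e-3, 10e-3, 100)             # measured currents (A)
sigma = rng.uniform(0.1, 2.0, 100)            # per-sample noise std (varying SNR)
u = R_true * i + sigma * rng.normal(size=100)

w = 1.0 / sigma**2                            # optimal weights: inverse noise variance
R_wls = np.sum(w * i * u) / np.sum(w * i**2)  # weighted least squares estimate
R_ls = np.sum(i * u) / np.sum(i**2)           # unweighted estimate for comparison
print(R_wls, R_ls)
```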
A method for improving the results of radio location systems that incorporate weighted least squares optimization generalizes the weighted least squares method by using maximum a posteriori (MAP) probability metrics to incorporate characteristics of the specific positioning problem (e.g., UTDOA). Weighted least squares methods are typically used by TDOA and related location systems including TDOA/AOA and TDOA/GPS hybrid systems. The incorporated characteristics include empirical information about TDOA errors and the probability distribution of the mobile position relative to other network elements. A technique is provided for modeling the TDOA error distribution and the a priori mobile position. A method for computing a MAP decision metric is provided using the new probability distribution models. Testing with field data shows that this method yields significant improvement over existing weighted least squares methods.
Hinterstoisser B., Schwanninger M., Rodrigues JC., Gierlinger N.: Determination of lignin content in Norway spruce wood by Fourier transformed near infrared spectroscopy and partial least squares regression analysis. Part 2: Development and evaluation of the final model, J Near Infrared Spec., 19(5), 2011, 331- ...
In the present paper, QSAR modeling using electrotopological state atom (E-state) parameters has been attempted to determine the antiradical and antioxidant activities of flavonoids in two model systems reported by Burda et al. (2001). The antiradical property towards a methanolic solution of 1,1-diphenyl-2-picrylhydrazyl (DPPH) and the antioxidant activity of flavonoids in a β-carotene-linoleic acid system were the two model systems studied. The different statistical tools used in this communication are stepwise regression analysis, multiple linear regression with factor analysis as the preprocessing step for variable selection (FA-MLR) and partial least squares analysis (PLS). For both activities, the best equation is obtained from stepwise regression analysis, considering both equation statistics and predictive ability (antiradical activity: R² = 0.927, Q² = 0.871; antioxidant activity: R² = 0.901, Q² = 0.841).
Genedata Analyst™ is the premier software solution for the integration and interpretation of experimental data in life science R&D. It puts rigorous statistical algorithms, interactive data analysis tools, and intuitive visualization into the hands of researchers and biostatisticians alike. Built on a scalable client-server architecture with a rich set of APIs, Genedata Analyst provides a centrally managed, secure, and scalable data mining platform that can be easily integrated into existing research IT environments. Advanced interactive data mining and visualizations are complemented by statistical applications including the t-test, ANOVA, linear models, Principal Component Analysis (PCA), Partial Least Squares analysis (PLS), and many more. ...
From http://www.cnblogs.com/tychyg/p/4868626.html. Basics: MSE (Mean Square Error), LMS (Least Mean Square), LSM (Least Squares Method), MLE (Maximum Likelihood Estimation) ...
A spectrophotometric method for a selective complexation reaction and the simultaneous determination of mycophenolate mofetil (MPM) and mycophenolic acid (MPA) using three multivariate chemometric methods, i.e. partial least squares regression, principal component regression and principal component artificial neural networks, is proposed. The method is based on the complexation reaction of MPM and MPA with the Fe(III) ion in solution. A nonionic surfactant, Triton X-100, was used to dissolve the complexes and intensify the signals. The linear ranges for the determination of MPA and MPM were 5.0-215.0 mg l⁻¹ and 10.0-1000.0 mg l⁻¹, respectively. The detection limits for MPA and MPM were 0.3 mg l⁻¹ and 1.1 mg l⁻¹, respectively. Satisfactory results were obtained by combining the spectrophotometric method with chemometric techniques. The method was successfully applied to the simultaneous determination of MPM and MPA in a serum sample, and the results were comparable with HPLC ...
Machine fault prognosis techniques have received considerable attention recently because of their benefit in reducing unexpected faults and unscheduled maintenance. With these techniques, the working conditions of components, the trend of fault propagation, and the time-to-failure are forecast precisely before the failure thresholds are reached. In this work, we propose an approach based on the Least Squares Regression Tree (LSRT), an extension of the Classification and Regression Tree (CART), in association with one-step-ahead time-series forecasting to predict the future conditions of machines. In this technique, the number of available observations is first determined using Cao's method, and LSRT is then employed as the prognosis system. The proposed approach is evaluated on real data from a low-methane compressor. Furthermore, the comparison between the predicted results ...
The K-S tests were built upon the null hypothesis F_n(x) = F(x), not on \hat{\theta} = \theta_0 (a parameter). The confidence band is laid upon the empirical distribution; to put it simply, it is a confidence band on the estimated p-value. I didn't see that they included any parameter estimation and its confidence band. The χ² method provides a best fit based on least squares methods. If the errors of the response variable are normally distributed, then this least squares method gives the same solution as the maximum likelihood estimator. Thanks to Wald (1949) and a series of later works, we know that this ML estimator is consistent and asymptotically normal, so that the χ² minimum + 1 can yield a 68% confidence interval (one parameter). This interval is only valid when the ML estimator is consistent. Protossov et al. (2001) showed some of the regularity conditions on the model needed to reach this consistent estimator. I'm not sure all astronomical models satisfy these conditions to ...
Quantile regression has advantageous properties compared with OLS regression: it measures the effects of a covariate across the full response distribution, and it has robustness and equivariance properties. In this paper, I use survey data from Belgium and apply a linear model to illustrate these advantages of quantile regression. I then use a quantile regression model on the raw data to analyze how family expenditure differs with the number of children, and apply a Wald test. The results show that, for most family types and living standards, from the lower quantiles to the upper quantiles the family expenditure on children increases with the number of children and the cost of each child is the same. We also found a common behavior: at the upper quantiles (from the 0.75 quantile to the 0.9 quantile) of the conditional distribution, the cost of the second child is significantly higher than the cost of the first child for non-working families and for families of all living standards. ...
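A hedged sketch of quantile regression at several quantiles using statsmodels; the variables and data below are simulated stand-ins, not the Belgian survey.

```python
# Hedged sketch: quantile regression of household expenditure on the number of
# children (invented variables and data; coefficients are purely illustrative).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
children = rng.integers(0, 4, n)
income = rng.normal(30000, 8000, n)
# Heteroscedastic cost: the spread grows with the number of children
cost = 2000 * children + 0.05 * income + rng.normal(0, 500 * (1 + children), n)
df = pd.DataFrame({"cost": cost, "children": children, "income": income})

for q in (0.25, 0.5, 0.75, 0.9):
    fit = smf.quantreg("cost ~ children + income", df).fit(q=q)
    print(f"q={q}: children coefficient = {fit.params['children']:.1f}")
```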
Anyway, in either of the above cases the line is estimated using the so-called "least squares" method: the "best" line is the one for which the sum of squared differences between the points and the line itself is a minimum. In the first model, the "level" that minimizes the sum of squared distances of the points from the line is the average of the weight values \(\overline{y}\). The dashed green lines represent the distances of some observed points from the "best" (estimated) line. In the second model, the result of applying the least-squares method is a bit more complex, but it can still be obtained in closed form, as: ...
The main objective of this paper is to apply genetic programming (GP) with an orthogonal least squares (OLS) algorithm to derive a predictive model for the compressive strength of carbon fiber-reinforced plastic (CFRP) confined concrete cylinders. The GP/OLS model was developed based on experimental results obtained from the literature. Traditional GP-based and least squares regression analyses were performed using the same variables and data sets to benchmark the GP/OLS model. A subsequent parametric analysis was carried out and the trends of the results were confirmed via previous laboratory studies. The results indicate that the proposed formula can predict the ultimate compressive strength of concrete cylinders with an acceptable level of accuracy. The GP/OLS results are more accurate than those obtained using GP, regression, or several CFRP confinement models found in the literature. The GP/OLS-based formula is simple and straightforward, and provides a valuable tool for analysis. ...
Activity: Tootie Fruities. 1. Each person grabs one handful of Tootie Fruities and does a quantitative analysis of that event (how much did you grab?). Discuss an exact procedure for how to grab the cereal. 2. Make a histogram of the data and discuss the shape, center, and spread of the data.
Cellular behavior in response to stimulatory cues is governed by information encoded within a complex intracellular signaling network. An understanding of how phenotype is determined requires the distributed characterization of signaling processes (e.g., phosphorylation states and kinase activities) in parallel with measures of resulting cell function. We previously applied quantitative mass spectrometry methods to characterize the dynamics of tyrosine phosphorylation in human mammary epithelial cells with varying human epidermal growth factor receptor 2 (HER2) expression levels after treatment with epidermal growth factor (EGF) or heregulin (HRG). We sought to identify potential mechanisms by which changes in tyrosine phosphorylation govern changes in cell migration or proliferation, two behaviors that we measured in the same cell system. Here, we describe the use of a computational linear mapping technique, partial least squares regression (PLSR), to detail and characterize signaling mechanisms
Roll compaction is gaining importance in the pharmaceutical industry for the dry granulation of heat- or moisture-sensitive powder blends with poor flow properties prior to tabletting. We studied the influence of microcrystalline cellulose (MCC) properties on the roll compaction process and the consecutive steps in tablet manufacturing. Four dissimilar MCC grades, selected by subjecting their physical characteristics to principal component analysis, and three speed ratios, i.e. the ratio of the feed screw speed to the roll speed of the roll compactor, were included in a full factorial design. Orthogonal projection to latent structures was then used to model the properties of the resulting roll-compacted products (ribbons, granules and tablets) as a function of the physical MCC properties and the speed ratio. This modified version of partial least squares regression separates the variation in the design that is correlated with the considered response from the variation orthogonal to that response. The ...
Date: Dec 12, 2011. Score plots from the principal component analysis of gas chromatography data for wine extracts showed grouping trends that were influenced by storage time and temperature. PCA loading plots revealed that changes in chemical profiles were different for wines held at different storage temperatures. Storage time could be predicted accurately by partial least squares regression of the GC data and, in general, the enological parameters could be predicted accurately from GC fingerprints.
Early detection of breast cancer is key to successful treatment and patient survival. We have previously reported the potential use of gene expression profiling of peripheral blood cells for early detection of breast cancer. The aim of the present study was to refine these findings using a larger sample size and a commercially available microarray platform. Blood samples were collected from 121 females referred for diagnostic mammography following an initial suspicious screening mammogram. Diagnostic work-up revealed that 67 of these women had breast cancer while 54 had no malignant disease. Additionally, nine samples from six healthy female controls were included. Gene expression analyses were conducted using high density oligonucleotide microarrays. Partial Least Squares Regression (PLSR) was used for model building while a leave-one-out (LOO) double cross validation approach was used to identify predictors and estimate their prediction efficiency. A set of 738 probes that discriminated breast cancer
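A hedged, simplified illustration of leave-one-out validation with PLS regression (a single CV loop on simulated data, not the study's double cross-validation or its 738-probe signature):

```python
# Hedged sketch: LOO prediction with PLS regression on a simulated two-class problem.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))             # expression matrix (samples x probes), simulated
y = rng.integers(0, 2, 60).astype(float)   # 1 = cancer, 0 = no malignancy (simulated)

scores = cross_val_predict(PLSRegression(n_components=3), X, y,
                           cv=LeaveOneOut()).ravel()
pred = (scores > 0.5).astype(int)          # threshold the continuous PLS score
print("LOO accuracy:", np.mean(pred == y))
```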
Peatlands are important terrestrial carbon stores. Restoration of degraded peatlands to restore ecosystem services is a major area of conservation effort. Monitoring is crucial to judge the success of this restoration. Remote sensing is a potential tool to provide landscape-scale information on the habitat condition. Using an empirical modelling approach, this paper aims to use airborne hyperspectral image data with ground vegetation survey data to model vegetation abundance for a degraded upland blanket bog in the United Kingdom (UK), which is undergoing restoration. A predictive model for vegetation abundance of Plant Functional Types (PFT) was produced using a Partial Least Squares Regression (PLSR) and applied to the whole restoration site. A sensitivity test on the relationships between spectral data and vegetation abundance at PFT and single species level confirmed that PFT was the correct scale for analysis. The PLSR modelling allows selection of variables based upon the weighted ...
Currently, clinicians use the levels of the hormone receptors for estrogen (ER) and progesterone (PR) and amplification of the ErbB2 receptor tyrosine kinase (RTK) to distinguish three groups of breast cancers and guide the choice of therapeutic treatments. In this work we show that a limited number of measurements characterizing breast cancer cell lines at the level of RTKs can be used to predict cell line sensitivity to therapeutic inhibitors. We chose a panel of approximately 40 breast cancer cell lines, treated them with 22 different growth factors and cytokines at a saturating and a subsaturating dose for 10, 30, and 90 min, and measured the phosphorylation of key kinases downstream of the receptors. In addition, we measured the expression and phosphorylation levels of approximately 20 RTKs under serum starvation conditions. We used the measurements to predict the GI50 values of the cell lines in response to over 45 therapeutic inhibitors by partial least squares regression. We found that our ...
Benthic algal biomass depends on a number of variables, including local factors such as the ionic water composition, nutrient availability, light, or water velocity, and large-scale factors such as the drainage area and land uses in the watershed. The relative roles of local and large-scale factors affecting benthic chlorophyll variation were analyzed in the Guadiana watershed by means of variation partitioning and partial least squares regression. The potential relevance of 20 physical, chemical and physiographical variables was analyzed throughout the watershed, and separately in three distinct geological units: upper watershed calcareous streams, streams with siliceous bedrocks, and the main river reaches. Our results suggest that many predictors of algal biomass are intercorrelated but have independent effects, and that their importance varies between geological units. Nutrient content and land uses exerted the largest influence on the pattern of chlorophyll-a variation in the three ecotypes ...
How do you find the test statistic using the chi-square method? A professional bowler has been having trouble hitting the ...
A total of 136,250 monthly test-day milk records collected from 13,625 Iranian Holstein heifers (milked three times a day) that calved between 1991 and 2001 and were distributed over 264 herds were used to study the effects of environmental factors influencing lactation curve parameters as well as production characteristics. Wilmink's function (Yt = W0 + W1·t + W2·e^(-0.05t)) was fitted to individual lactations. Least squares analysis of variance indicated that herd, year and month of calving had a significant effect on all traits under consideration. Correlation analysis showed that the parameter W0 had a negative and significant (p ...
Least squares inference in phylogeny generates a phylogenetic tree based on an observed matrix of pairwise genetic distances and, optionally, a weight matrix. The goal is to find a tree that satisfies the distance constraints as well as possible. The discrepancy between the observed pairwise distances D_{ij} and the distances T_{ij} over a phylogenetic tree (i.e. the sum of the branch lengths on the path from leaf i to leaf j) is measured by S = \sum_{ij} w_{ij} (D_{ij} - T_{ij})^2, where the weights w_{ij} depend on the least squares method used. Least squares distance tree construction aims to find the tree (topology and branch lengths) with minimal S. This is a non-trivial problem: it involves searching the discrete space of unrooted binary tree topologies, whose size is exponential in the number of leaves. For n leaves there are 1 · 3 · ...
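The truncated product above is the standard double-factorial count of unrooted binary tree topologies, stated here for completeness:

```latex
% Number of distinct unrooted binary tree topologies on n labelled leaves (n >= 3)
(2n-5)!! \;=\; 1 \cdot 3 \cdot 5 \cdots (2n-5)
% e.g. n = 10 leaves already gives 2{,}027{,}025 topologies
```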
Egg production and layer bird mortality data obtained from 5 different private farms in Zaria, within the sub-humid zone of Nigeria, over a six-year period (1994-1999) were subjected to least squares analysis to determine the effects of age, season and year. It was demonstrated that age had a highly significant influence (P<0.01) on egg production, in that birds within the age group 30-39 weeks produced the highest number of eggs (3255 ± 109), while birds over 100 weeks of age produced the fewest eggs (1206 ± 412). Similarly, seasonal variation in egg production was also significant (P<0.05); the highest egg production (2926 ± 90) was obtained during the early dry season, and the lowest (2423 ± 95) during the late wet season. Mortality was generally low (0.0-0.9%) and not significantly different from 20-49 weeks of age (P>0.05). However, from 50 to 100 weeks of age, highly significant differences (P<0.01) in mortality were ...
Preface xi
1 Introduction to Linear and Generalized Linear Models 1
1.1 Components of a Generalized Linear Model 2
1.2 Quantitative/Qualitative Explanatory Variables and Interpreting Effects 6
1.3 Model Matrices and Model Vector Spaces 10
1.4 Identifiability and Estimability 13
1.5 Example: Using Software to Fit a GLM 15
Chapter Notes 20
Exercises 21
2 Linear Models: Least Squares Theory 26
2.1 Least Squares Model Fitting 27
2.2 Projections of Data Onto Model Spaces 33
2.3 Linear Model Examples: Projections and SS Decompositions 41
2.4 Summarizing Variability in a Linear Model 49
2.5 Residuals, Leverage, and Influence 56
2.6 Example: Summarizing the Fit of a Linear Model 62
2.7 Optimality of Least Squares and Generalized Least Squares 67
Chapter Notes 71
Exercises 71
3 Normal Linear Models: Statistical Inference 80
3.1 Distribution Theory for Normal Variates 81
3.2 Significance Tests for Normal Linear Models 86
3.3 Confidence Intervals and Prediction Intervals for Normal ...
In this investigation, different mixtures of realgar and orpiment particles were floated in a laboratory batch flotation cell, and multivariate image analysis was used to estimate the arsenic content in the froths. The analysis was based on the froth colour, as well as on the extraction of three groups of textural features, namely those based on grey level co-occurrence matrices (GLCMs), wavelets and local binary patterns (LBPs). Collectively, these features provided better information on the arsenic content of the froths than any one of the individual groups of features. Partial least squares models could explain approximately 78% of the variance in the arsenic content by using all the features combined. The colour content, particularly the green component of the red-green-blue (RGB) features, provided almost as much information as each of the three sets of texture-based features individually. ...
Despite the growing importance of longitudinal data in neuroimaging, the standard analysis methods make restrictive or unrealistic assumptions (e.g., the assumption of compound symmetry, i.e. all equal variances and equal correlations, or spatially homogeneous longitudinal correlations). While some new methods have been proposed to account for such data more accurately, these methods are based on iterative algorithms that are slow and failure-prone. In this article, we propose the use of the Sandwich Estimator method, which first estimates the parameters of interest with a simple Ordinary Least Squares model and then estimates the variances/covariances with the so-called Sandwich Estimator (SwE), which accounts for the within-subject correlation present in longitudinal data. Here, we introduce the SwE method in its classic form, and we review and propose several adjustments to improve its behaviour, particularly in small samples. We use intensive Monte Carlo simulations to compare all considered ...
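A hedged sketch of the general idea using statsmodels: fit OLS, then replace the naive covariance with a subject-level (cluster) sandwich estimator. The data and model are simulated placeholders, not the authors' neuroimaging implementation.

```python
# Hedged sketch: OLS point estimates with a cluster (subject-level) sandwich
# covariance, so within-subject correlation does not invalidate standard errors.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_subj, n_visits = 40, 4
subject = np.repeat(np.arange(n_subj), n_visits)
time = np.tile(np.arange(n_visits), n_subj)
subj_effect = rng.normal(0, 1.0, n_subj)[subject]   # induces within-subject correlation
y = 0.3 * time + subj_effect + rng.normal(0, 0.5, n_subj * n_visits)

X = sm.add_constant(time)
ols = sm.OLS(y, X).fit()                                          # naive (iid) standard errors
swe = sm.OLS(y, X).fit(cov_type="cluster", cov_kwds={"groups": subject})
print(ols.bse, swe.bse)                                           # compare standard errors
```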
In computer-assisted planning of radiation treatment, and more specifically in the software RayStation developed by RaySearch, certain kinds of model calibration problems arise. The process of solving these problems is called beam commissioning. Today, beam commissioning is handled by optimizing subsets of the underlying parameters using a quasi-Newton algorithm. In this thesis we investigate the beam commissioning problem space for all of the parameters. We find that the variables are rather well behaved and therefore propose a method based on linearizing dose before scoring. This reduces the number of expensive function calls drastically and allows us to optimize with respect to all of the underlying parameters simultaneously. When using a least squares score function, the method coincides with the Gauss-Newton method, a well-known nonlinear least squares method with fast convergence properties if good starting points are available. We apply this method to a weighted least squares approximation ...
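A hedged sketch of a plain Gauss-Newton iteration for a small weighted nonlinear least squares fit; the model and data are invented stand-ins, not RayStation's dose calibration problem.

```python
# Hedged sketch: Gauss-Newton for minimising sum(r(p)^2), applied to an
# exponential-decay fit with per-point weights (all equal here).
import numpy as np

def gauss_newton(residual, jacobian, p0, n_iter=20):
    """Gauss-Newton: each step solves the linearised least squares problem."""
    p = np.array(p0, dtype=float)
    for _ in range(n_iter):
        r = residual(p)
        J = jacobian(p)
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)   # solves J^T J dp = -J^T r
        p = p + step
    return p

rng = np.random.default_rng(0)
t = np.linspace(0, 2, 50)
y = 2.0 * np.exp(-1.3 * t) + rng.normal(0, 0.05, t.size)   # data from a*exp(-b*t)
w = np.ones_like(t)                                         # weights (illustrative)

residual = lambda p: np.sqrt(w) * (p[0] * np.exp(-p[1] * t) - y)
jacobian = lambda p: np.sqrt(w)[:, None] * np.column_stack(
    [np.exp(-p[1] * t), -p[0] * t * np.exp(-p[1] * t)])

print(gauss_newton(residual, jacobian, p0=[1.5, 1.0]))      # expected near [2.0, 1.3]
```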
Inferential models are widely used in the chemical industry to infer key process variables, which are challenging or expensive to measure, from other more easily measured variables. The aim of this paper is three-fold: to present a theoretical review of some of the well-known linear inferential modeling techniques, to enhance the predictive ability of the regularized canonical correlation analysis (RCCA) method, and finally to compare the performances of these techniques and highlight some of the practical issues that can affect their predictive abilities. The inferential modeling techniques considered in this study include full-rank modeling techniques, such as ordinary least squares (OLS) regression and ridge regression (RR), and latent variable regression (LVR) techniques, such as principal component regression (PCR), partial least squares (PLS) regression, and regularized canonical correlation analysis (RCCA). The theoretical analysis shows that the loading vectors used in LVR modeling can be ...
Title: Least Squares (Monograph in English). Author: Gligorije Perovic. Publisher: published by the author, University of Belgrade, Faculty of Civil Engineering, Belgrade. ISBN: 86-907409-0-2. Year: 2005. Price: Euro 120 (including postage). Pages: 648. Details: Hardcover, 87 Figures and 90 Tables, nicely typeset on high-quality paper. This book is a very comprehensive introduction to least-squares methods, including classical topics such as least-squares adjustment, but also contemporary methods such as statistical tests, Bayes estimation, Kalman filtering, least-squares collocation, time series, and optimal planning of experiments. Thus it has the character of an encyclopaedic monograph.. The book is intended for all scientists and engineers who perform highly precise measurements and who wish to analyze them by sophisticated linear models of mathematical data processing, in particular for geodetic scientists and engineers.. The required mathematics is on the level of a graduate engineering ...
The scaling of metabolic rate with body size is widely considered to be of great biological and ecological importance, and much attention has been devoted to determining its theoretical and empirical value. Most debate centres on whether the underlying power law determining metabolic rates has an exponent of 2/3 (as predicted by the scaling of surface area/volume relationships) or 3/4 (Kleiber's Law). Although recent evidence suggests that empirically derived exponents vary among clades with radically different metabolic strategies, such as ectotherms and endotherms, models such as the Metabolic Theory of Ecology depend on the assumption that there is at least a predominant, if not universal, metabolic scaling exponent. Most analyses claimed to support the predictions of general models, however, fail to control for phylogeny. We used phylogenetic generalised least squares models to estimate allometric slopes for both basal metabolic rate (BMR) and field metabolic rate (FMR) in mammals. Metabolic rate scaling ...
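A hedged sketch of the phylogenetic generalized least squares idea: ordinary GLS with a phylogenetic covariance matrix supplied as the error covariance. The covariance, data, and slope below are simulated placeholders; a real analysis would derive the covariance from the phylogeny, e.g. under a Brownian motion model.

```python
# Hedged sketch: GLS allometric fit of log(BMR) on log(mass) with a supplied
# "phylogenetic" error covariance (here a fake two-clade block structure).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 30                                          # species
V = 0.3 * np.ones((n, n)) + 0.7 * np.eye(n)     # within-clade correlation
V[:15, 15:] = V[15:, :15] = 0.1                 # weaker between-clade correlation

log_mass = rng.normal(3.0, 1.0, n)
L = np.linalg.cholesky(V)
log_bmr = 0.5 + 0.72 * log_mass + L @ rng.normal(0, 0.2, n)   # correlated errors

X = sm.add_constant(log_mass)
pgls = sm.GLS(log_bmr, X, sigma=V).fit()
print("allometric slope:", pgls.params[1])
```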