Consider first a normal regression of y on X using LINEST. If const = TRUE, the regressor matrix is the augmented matrix consisting of a column of ones followed by the regressor columns, i.e. X' = (1, X). If const = FALSE, the regressor matrix is simply X, so running the regression with a column of ones included gives the same estimates as running without a column of ones and setting const = TRUE.

Now consider a weighted least squares regression. The regression is now Wy on WX' = (W1, WX), where W is the diagonal matrix consisting of the square roots of the weights. Since there is no column of ones present, we must set const = FALSE and use two columns in the regressor matrix.

With data in A2:C7, based on the standard weighted least squares formula, you can try:

    =LINEST(B2:B7*C2:C7^0.5, IF({1,0},1,A2:A7)*C2:C7^0.5, 0, 1)

Setting stats to TRUE, the third and fifth rows of the LINEST output of this formula give, among other statistics, SSres = 59.76. Note these values are uncentered versions (u), since const = FALSE (refer to the Microsoft Help on LINEST for further information). For the centered versions (c) we need to subtract the weighted average, as below:

    SSTot(c) = SUMPRODUCT(C2:C7*(B2:B7-SUM(B2:B7*C2:C7)/SUM(C2:C7))^2) = 244.93

Based on the additional information that you have tens of thousands of rows, here is a VBA UDF that will do the job (including the r2). As per the screenshot below, it provides the same m, x and r2 values that my expanded data set did in my original answer:

    Public Function LinestWeighted(xRng As Range, yRng As Range, wRng As Range, bInt As Boolean, bStat As Boolean) As Variant
        Dim x As Variant, y As Variant, W As Variant
        Dim TotX As Variant, TotY As Variant
        Dim NewSeries() As Double, NewY() As Double
        Dim strX As String, strY As String
        Dim lngRow As Long
        Const strDelim As String = "|"

        x = xRng.Value2: y = yRng.Value2: W = wRng.Value2
        If (UBound(x, 1) = UBound(y, 1)) And (UBound(x, 1) = UBound(W, 1)) Then
            'Scale each x and y by the square root of its weight
            For lngRow = 1 To UBound(x, 1)
                strX = strX & x(lngRow, 1) * Sqr(W(lngRow, 1)) & strDelim
                strY = strY & y(lngRow, 1) * Sqr(W(lngRow, 1)) & strDelim
            Next lngRow
            TotX = Split(Left$(strX, Len(strX) - 1), strDelim)
            TotY = Split(Left$(strY, Len(strY) - 1), strDelim)
            'Two-column regressor (WX, W1): weighted x plus the weighted "column of ones"
            ReDim NewSeries(1 To UBound(TotX) + 1, 1 To 2)
            ReDim NewY(1 To UBound(TotY) + 1, 1 To 1)
            For lngRow = 0 To UBound(TotX)
                NewSeries(lngRow + 1, 1) = CDbl(TotX(lngRow))
                NewSeries(lngRow + 1, 2) = Sqr(W(lngRow + 1, 1))
                NewY(lngRow + 1, 1) = CDbl(TotY(lngRow))
            Next lngRow
            'Per the derivation above, const must be FALSE, so call with bInt = False
            LinestWeighted = Application.WorksheetFunction.LinEst(NewY, NewSeries, bInt, bStat)
        End If
    End Function
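The sqrt-weight scaling trick used in the LINEST formula above can be checked numerically outside Excel. Below is a minimal sketch in Python/NumPy with made-up data (the values stand in for A2:C7 and are purely illustrative): regressing sqrt(w)*y on the two columns (sqrt(w)*1, sqrt(w)*x) with no intercept reproduces the closed-form weighted least squares solution.

```python
import numpy as np

# Made-up sample data: x, y, and weights (analogous to A2:A7, B2:B7, C2:C7)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 11.9])
w = np.array([1.0, 2.0, 1.0, 0.5, 1.0, 2.0])

# LINEST trick: regress sqrt(w)*y on the two columns (sqrt(w)*1, sqrt(w)*x)
# with no intercept (const = FALSE)
sw = np.sqrt(w)
X = np.column_stack([sw, sw * x])          # (W1, WX)
coef, *_ = np.linalg.lstsq(X, sw * y, rcond=None)
b, m = coef                                # intercept, slope

# Direct weighted least squares normal equations: (A'WA) beta = A'Wy
A = np.column_stack([np.ones_like(x), x])
beta = np.linalg.solve(A.T @ (w[:, None] * A), A.T @ (w * y))

print(m, b)
```

Minimizing the sum of squared residuals of the scaled system is exactly minimizing the weighted sum of squared residuals of the original system, so `coef` and `beta` agree.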
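The centering step for SSTot shown in the SUMPRODUCT formula above can also be sketched in Python/NumPy. The data and fitted values below are made up for illustration; only the structure of the calculation mirrors the spreadsheet formula.

```python
import numpy as np

# Made-up data analogous to B2:B7 (y) and C2:C7 (weights)
y    = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 11.9])
w    = np.array([1.0, 2.0, 1.0, 0.5, 1.0, 2.0])
yhat = np.array([2.0, 4.0, 6.0, 8.0, 10.0, 12.0])  # illustrative fitted values

# Weighted mean, as in SUM(B2:B7*C2:C7)/SUM(C2:C7)
ybar_w = np.sum(y * w) / np.sum(w)

# Centered weighted total sum of squares, as in
# SUMPRODUCT(C2:C7*(B2:B7 - ybar_w)^2)
ss_tot_c = np.sum(w * (y - ybar_w) ** 2)

# Weighted residual sum of squares and centered r-squared
ss_res = np.sum(w * (y - yhat) ** 2)
r2 = 1 - ss_res / ss_tot_c
```

Using the centered SSTot here is what turns the uncentered (u) statistics returned when const = FALSE into the centered (c) r2.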
15.3.6.5 Fitting with Errors and Weighting

In some cases you may want certain data points to factor more heavily than others into the fitting calculations. When selecting datasets for the fitting, you can therefore also configure weighting settings in the Data Selection page of the Settings tab to perform a weighted fit. After fitting, you will get the results with the weighting applied.

When the Iteration Algorithm is Levenberg-Marquardt, only adding weight for the Y data is supported, while with Orthogonal Distance Regression (Pro only) both X and Y weights are supported. When there are multiple input datasets, you can specify different weighting methods for each Y and/or X dataset.

The weights are used in the procedure of reducing Chi-Square; you may refer to the Iteration Algorithm for the formula used in different cases. Origin supports a number of weighting methods; some can be used with both the L-M and ODR algorithms, while some can only be used with L-M. See the table below for the formula used to calculate the weight in each case. In those formulas, y_i are the values of the variable to be fit and ŷ_i are the y values of the fitted curve (note that y here stands for a function parameter name; it is not referring to the dependent variable). Some methods use the error-bar sizes stored in error bar columns, and others use the values of arbitrary datasets as the weights.
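The weighting methods described above can be sketched numerically. This is a hedged illustration, not Origin's implementation: the data are made up, and the specific formulas (w = 1/sigma^2 for error-bar/instrumental weighting, w = 1/y for statistical weighting, and weights taken directly from an arbitrary dataset) are common conventions assumed here to match the descriptions in the text.

```python
import numpy as np

# Illustrative data: measured y, fitted yhat, per-point error bars, and
# an arbitrary weighting dataset (all values made up)
y     = np.array([10.0, 20.0, 30.0, 40.0])
yhat  = np.array([10.5, 19.0, 31.0, 39.5])
sigma = np.array([0.5, 1.0, 1.5, 1.0])   # error-bar sizes (error bar column)
arb   = np.array([1.0, 4.0, 2.0, 1.0])   # values of an arbitrary dataset

# Common weighting choices (assumed formulas; Origin's full set is broader)
w_none         = np.ones_like(y)   # no weighting
w_instrumental = 1.0 / sigma**2    # from error-bar sizes
w_statistical  = 1.0 / y           # for counting-type data
w_arbitrary    = arb               # taken directly from the dataset

# The weights enter the quantity minimized during iteration,
# the weighted chi-square: sum_i w_i * (y_i - yhat_i)^2
def chi2(w):
    return np.sum(w * (y - yhat) ** 2)
```

With `w_none` this reduces to the ordinary (unweighted) sum of squared residuals; the other choices shrink or amplify each point's contribution according to its weight.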