
How to Read a Chi-Square Table

From Wikipedia

Chi-square distribution

In probability theory and statistics, the chi-square distribution (also chi-squared or χ²-distribution) with k degrees of freedom is the distribution of a sum of the squares of k independent standard normal random variables. It is one of the most widely used probability distributions in inferential statistics, e.g. in hypothesis testing or in the construction of confidence intervals. When there is a need to contrast it with the noncentral chi-square distribution, this distribution is sometimes called the central chi-square distribution.

The best-known situations in which the chi-square distribution is used are the common chi-square tests: for goodness of fit of an observed distribution to a theoretical one, and for the independence of two criteria of classification of qualitative data. A third well-known use is the confidence interval estimation for a population standard deviation of a normal distribution from a sample standard deviation. Many other statistical tests also lead to a use of this distribution, such as Friedman's analysis of variance by ranks.

The chi-square distribution is a special case of the gamma distribution.

Definition

If X1, ..., Xk are independent, standard normal random variables, then the sum of their squares

Q = \sum_{i=1}^k X_i^2

is distributed according to the chi-square distribution with k degrees of freedom. This is usually denoted as

Q\ \sim\ \chi^2(k)\ \ \text{or}\ \ Q\ \sim\ \chi^2_k

The chi-square distribution has one parameter: k, a positive integer that specifies the number of degrees of freedom (i.e. the number of Xi's).
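The definition above can be checked empirically. The following is a minimal Python sketch (standard library only, not part of the original article) that draws sums of k squared standard normals and verifies that the sample mean and variance land near the known values k and 2k:

```python
import random

def chi_square_sample(k, rng):
    """One chi-square(k) draw: the sum of k squared standard normal variates."""
    return sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(k))

rng = random.Random(0)
k = 5
n = 50_000
samples = [chi_square_sample(k, rng) for _ in range(n)]

mean = sum(samples) / n
var = sum((s - mean) ** 2 for s in samples) / n

# A chi-square(k) variable has mean k and variance 2k,
# so these should be close to 5 and 10 respectively.
print(round(mean, 2), round(var, 2))
```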

Characteristics


Probability density function

The probability density function (pdf) of the chi-square distribution is

f(x;\,k) = \frac{1}{2^{k/2}\Gamma(k/2)}\,x^{k/2 - 1} e^{-x/2}\, I_{\{x\geq0\}}, where Γ(k/2) denotes the Gamma function, which has closed-form values at the half-integers.
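The pdf above can be evaluated directly with Python's `math.gamma`. The sketch below (an illustrative addition, not from the original text) also shows that for k = 2 it reduces to an exponential density:

```python
import math

def chi2_pdf(x, k):
    """Chi-square pdf: x^(k/2 - 1) * exp(-x/2) / (2^(k/2) * Gamma(k/2)) for x >= 0."""
    if x < 0:
        return 0.0
    return x ** (k / 2 - 1) * math.exp(-x / 2) / (2 ** (k / 2) * math.gamma(k / 2))

# For k = 2 the pdf reduces to the exponential density exp(-x/2) / 2.
print(chi2_pdf(1.0, 2))  # exp(-0.5) / 2 ≈ 0.3033
```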

For derivations of the pdf in the cases of one and two degrees of freedom, see Proofs related to chi-square distribution.

Cumulative distribution function

F(x;\,k) = \frac{\gamma(k/2,\,x/2)}{\Gamma(k/2)} = P(k/2,\,x/2), where γ(k, z) is the lower incomplete Gamma function and P(k, z) is the regularized Gamma function.

In the special case k = 2 this function has a simple form:

F(x;\,2) = 1 - e^{-\frac{x}{2}}.

Tables of this distribution, usually in its cumulative form, are widely available, and the function is included in many spreadsheets and all statistical packages. For a closed-form approximation to the CDF, see under Noncentral chi-square distribution.
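As an illustration (not part of the original text), the k = 2 closed form above can be checked against a direct numerical integration of the pdf; both quantities agree to high precision:

```python
import math

def chi2_cdf_k2(x):
    """Closed form for k = 2: F(x; 2) = 1 - exp(-x/2)."""
    return 1.0 - math.exp(-x / 2)

def chi2_cdf_numeric(x, k, steps=50_000):
    """Trapezoidal integration of the chi-square pdf from 0 to x (valid for k >= 2)."""
    c = 1.0 / (2 ** (k / 2) * math.gamma(k / 2))
    h = x / steps
    total = 0.0
    for i in range(steps + 1):
        t = i * h
        # At t = 0 the pdf equals c for k = 2 and 0 for k > 2.
        f = c * t ** (k / 2 - 1) * math.exp(-t / 2) if t > 0 else (c if k == 2 else 0.0)
        total += f / 2 if i in (0, steps) else f
    return total * h

x = 3.0
print(chi2_cdf_k2(x), chi2_cdf_numeric(x, 2))  # both ≈ 0.7769
```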

It follows from the definition of the chi-square distribution that the sum of independent chi-square variables is also chi-square distributed. Specifically, if X_1, ..., X_n are independent chi-square variables with k_1, ..., k_n degrees of freedom, respectively, then Y = X_1 + ... + X_n is chi-square distributed with k_1 + ... + k_n degrees of freedom.

Information entropy

The information entropy is given by

H = -\int_{-\infty}^\infty f(x;\,k)\ln f(x;\,k) \, dx = \frac{k}{2} + \ln\big( 2\Gamma(k/2) \big) + \big(1 - k/2\big) \psi(k/2), where ψ(x) is the Digamma function.

Noncentral moments

The moments about zero of a chi-square distribution with k degrees of freedom are given by

\operatorname{E}(X^m) = k (k+2) (k+4) \cdots (k+2m-2) = 2^m \frac{\Gamma(m+k/2)}{\Gamma(k/2)}.
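The two expressions for the moments (the rising product and the Gamma-function ratio) can be checked against each other numerically. The following Python sketch (an illustrative addition, not from the original) verifies the identity for a few values of k and m:

```python
import math

def moment_product(k, m):
    """E[X^m] as the product k * (k+2) * (k+4) * ... * (k + 2m - 2)."""
    result = 1
    for i in range(m):
        result *= k + 2 * i
    return result

def moment_gamma(k, m):
    """E[X^m] as 2^m * Gamma(m + k/2) / Gamma(k/2)."""
    return 2 ** m * math.gamma(m + k / 2) / math.gamma(k / 2)

# The two formulas agree for every k and m tried here.
for k in (1, 3, 8):
    for m in (1, 2, 3, 4):
        assert math.isclose(moment_product(k, m), moment_gamma(k, m))

print(moment_product(4, 2))  # k = 4, m = 2: 4 * 6 = 24
```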

Cumulants

The cumulants are readily obtained by a (formal) power series expansion of the logarithm of the characteristic function:

\kappa_n = 2^{n-1}(n-1)!\,k

Asymptotic properties

By the central limit theorem, because the chi-square distribution is the sum of k independent random variables with finite mean and variance, it converges to a normal distribution for large k (for k > 50 the distribution is approximately normal). Specifically, if X ~ χ²(k), then as k tends to infinity, the distribution of (X-k)/\sqrt{2k} tends to a standard normal distribution.
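A quick Monte Carlo sketch of the standardization above (illustrative, standard library only): for reasonably large k, the variable (X - k)/sqrt(2k) should have sample mean near 0 and sample variance near 1:

```python
import math
import random

rng = random.Random(1)
k = 50        # a fairly large number of degrees of freedom
n = 20_000

# Standardized chi-square draws: Z = (X - k) / sqrt(2k),
# where X is a sum of k squared standard normals.
z = [(sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(k)) - k) / math.sqrt(2 * k)
     for _ in range(n)]

z_mean = sum(z) / n
z_var = sum((t - z_mean) ** 2 for t in z) / n

# For large k the standardized variable is approximately standard normal,
# so the sample mean and variance should be near 0 and 1.
print(round(z_mean, 2), round(z_var, 2))
```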

Question: Here's the question: A heterozygous white-fruited squash plant is crossed with a yellow-fruited plant, yielding 200 seeds. Hypothesis: white is autosomal dominant to yellow. Of these, 110 produce white-fruited plants while only 90 produce yellow-fruited plants. Does the data support the hypothesis or not? Explain using chi-square analysis. When I attempted this question, I ended up with some weird (did I mention BIG) number :P Help is very much appreciated! :D

Answers: I typed it into a chi-square calculator on the net and got 2 for my chi-square value. At first I didn't read the question right and got 42. Don't forget that because the white squash is heterozygous, we expect a fifty-fifty chance of getting white or yellow fruit, not 3:1 like I originally thought. The p-value, with df = 1, is about 0.16, so roughly a sixteen percent chance of a deviation this large arising by chance alone. Since that is well above the usual 0.05 cutoff, I'd say the data do support the hypothesis, though it depends on how strict you want to be.
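The squash cross above works out cleanly in a few lines of Python (an illustrative sketch; for df = 1 the p-value has a closed form via the complementary error function, because a chi-square(1) variable is a squared standard normal):

```python
import math

# Observed counts from the heterozygous-white x yellow cross (200 seeds).
observed = {"white": 110, "yellow": 90}
# Under the hypothesis (white dominant, parent heterozygous) we expect 1:1.
expected = {"white": 100, "yellow": 100}

chi2 = sum((observed[p] - expected[p]) ** 2 / expected[p] for p in observed)

# For df = 1: P(X > x) = erfc(sqrt(x / 2)),
# since chi-square(1) is the square of a standard normal.
p_value = math.erfc(math.sqrt(chi2 / 2))

print(chi2, round(p_value, 3))  # chi2 = 2.0, p ≈ 0.157
```

Since 0.157 > 0.05, the deviation is consistent with chance, so the 1:1 hypothesis is not rejected.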

Question: Thanks for the previous answers, but what I am looking for is how to figure out an area of, let's say, .95 when all I am given in my table are areas for .10, .05, .025, .01, .005. I need to know how to figure out an area when all I have are these numbers.

Answers: A chi-square table gives you the values such that 0.10, 0.05, 0.025, and so on of the area lies to the right of those values. An area of 0.95 to the left of a value is the same as 0.05 to the right, so use the value in the 0.05 column.

Question: Sorry for the unscientific way I'm asking, but what I mean is: does the table "vary" for different tests, or (for example) will the critical value for chi-square at P = .05 with 5 degrees of freedom always equal 11.07 no matter where you look? Thanks. (Thanks for the answer, but I mean in a BIO 101 context, so, yes or no?)

Answers: This is a good question, and for BIO 101 the short answer is yes: the numbers themselves are universal, so the value with 0.05 of the area to its right, for 5 degrees of freedom, is always 11.07. But you have to be really careful how the table is set up. Some tables label the columns by the area to the right of the value, so you look under 0.05; others label by the complementary probability, 1 - 0.05 = 0.95. For example, chi-square with subscript 0.95 (meaning 0.95 of the area to the right) for 5 degrees of freedom is 1.145, a completely different entry.

Question: The purpose of the chi-square (X^2) test is to determine whether experimentally obtained data constitute a good fit to a theoretical, expected ratio. In other words, the X^2 test enables one to determine whether it is reasonable to attribute deviations from an expected value to chance. Obviously, if deviations are small they can more reasonably be attributed to chance. The question is: how small must deviations be in order to be attributed to chance?

The formula for X^2 is as follows:

X^2 = Σ (O - E)^2 / E

where O = the observed number of individuals of a particular phenotype, E = the expected number in that phenotype, and Σ = the summation of (O - E)^2/E over all phenotypic categories.

The following is an example of how one might apply the test to genetics. In a cross of tall tomato plants to dwarf ones, the F1 consisted entirely of tall plants, and the F2 consisted of 102 tall and 44 dwarf plants. Does this data fit a 3:1 ratio? To answer this question, an X^2 value can be calculated (see table 1).

Table 1

Phenotype  Genotype  O    E      (O-E)  (O-E)^2  (O-E)^2/E
Tall       T_        102  109.5  -7.5   56.25    0.5137
Dwarf      tt        44   36.5   +7.5   56.25    1.5411
Totals               146  146                    2.0548

The calculated X^2 value is 2.0548. What does this mean? If the observed had equaled the expected, the value would have been zero. Thus, a small X^2 value indicates that the observed and expected ratios are in close agreement. However, are the observed deviations within the limits expected by chance? In order to determine this, one must look up the X^2 value on a chi-square table (see table 1.2). Statisticians have generally agreed on the arbitrary limit of odds of 1 chance in 20 (probability = .05) for drawing the line between acceptance and rejection of the hypothesis as a satisfactory explanation of the data tested. An X^2 value of 3.841 for a two-term ratio corresponds to a probability of .05. That means that one would obtain an X^2 value of 3.841 or greater due to chance alone on only about 5% of similar trials.

When X^2 exceeds 3.841 for a two-term ratio, the probability that the deviations can be attributed to chance alone is less than 1 in 20, and the hypothesis of compatibility between the observed and expected ratios must be rejected. In the example given, the X^2 value is much less than 3.841. Therefore, one can attribute the deviations to chance alone.

Table 1.2: Values of chi-square (probability of a larger value of X^2)

df   .995      .990     .975     .950    .900   .750  .500  .250  .100  .050  .025  .010  .005
1    .0000393  .000157  .000982  .00393  .0158  .102  .455  1.32  2.71  3.84  5.02  6.63  7.88
2    .0100     .0201    .0506    .103    .211   .575  1.39  2.77  4.61  5.99  7.38  9.21  10.6
3    .0717     .115     .216     .352    .584   1.21  2.37  4.11  6.25  7.81  9.35  11.3  12.8
4    .207      .297     .484     .711    1.06   1.92  3.36  5.39  7.78  9.49  11.1  13.3  14.9
5    .412      .554     .831     1.15    1.61   2.67  4.35  6.63  9.24  11.1  12.8  15.1  16.7

The number of degrees of freedom is the number of terms in the ratio minus one. In the example above (3:1) there are two terms, so the degrees of freedom is 2 - 1 = 1.

Application of the Chi Square Test

M&Ms are carefully selected for equality of size and shape, and quantities of each color were packaged in your bag. Randomly select enough M&Ms to cover the bottom of the cup. Count the number of M&Ms of each color and record your data in table 1.3. Then calculate the expected numbers based on the sample size and the known overall color ratio of M&Ms. Complete the table and calculate X^2.

Table 1.3

Phenotype (color)   O   E   (O-E)   (O-E)^2   (O-E)^2/E
Sum

How many degrees of freedom are there? ____________
Using table 1.2, what X^2 values lie on either side of your calculated X^2 value? ____________
What are the probability values associated with those X^2 values? ____________
Briefly interpret the X^2 value you have just calculated.

Answers:lol you typed so much i had to make a comment :D.
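The tomato-plant example from the long question above can be reproduced in a few lines of Python (an illustrative sketch, not part of the original thread):

```python
observed = [102, 44]                        # tall, dwarf (F2 counts)
total = sum(observed)                       # 146 plants
expected = [total * 3 / 4, total * 1 / 4]   # 3:1 ratio -> 109.5 and 36.5

chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# df = 2 - 1 = 1; since 2.0548 < 3.841, the deviation is attributable to chance.
print(round(chi2, 4))  # 2.0548
```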