000 17546nam a22002657a 4500
003 CO-BoICC
005 20171030161958.0
008 171027t20092005xxu||||fr|||| 001 0 eng d
020 _a9781847879066
020 _a9781847879073
040 _aCO-BoICC
041 0 _aeng
082 0 4 _221
_a302.015195
_bF453d
100 1 _aField, Andy,
_d1973-
245 1 0 _aDiscovering statistics using SPSS :
_band sex and drugs and rock 'n' roll /
_cAndy Field
250 _aThird edition
260 _aLos Angeles :
_bSAGE,
_c2009
300 _axxxii, 819 pages ;
_c27 cm
505 0 _aWhy is my evil lecturer forcing me to learn statistics?. -- What the hell am I doing here? I don't belong here. -- The research process. -- Initial observation: finding something that needs explaining. -- Generating theories and testing them. -- Data collection 1: what to measure. -- Variables. -- Measurement error. -- Validity and reliability. -- Data collection 2: how to measure. -- Correlational research methods. -- Experimental research methods. -- Randomization. -- Analysing data. -- Frequency distributions. -- The centre of a distribution. -- Using a frequency distribution to go beyond the data. -- Fitting statistical models to the data. -- Everything you ever wanted to know about statistics (well, sort of). -- Building statistical models. -- Populations and samples. -- Simple statistical models. -- The mean: a very simple statistical model. -- Assessing the fit of the mean: sums of squares, variance and standard deviations. -- Expressing the mean as a model. -- Going beyond the data. -- The standard error. -- Confidence intervals. -- Using statistical models to test research questions. -- Test statistics. -- One- and two-tailed tests. -- Type I and Type II errors. -- Effect sizes. -- Statistical power. -- The SPSS environment. -- What will this chapter tell me?. -- Version of SPSS. -- Getting started. -- The data editor. -- Entering data into the data editor. -- The 'Variable View'. -- Missing values. -- The SPSS viewer. -- The SPSS SmartViewer. -- The syntax window. -- Saving files. -- Retrieving a file. -- Exploring data with graphs. -- The art of presenting data. -- What makes a good graph?. -- Lies, damned lies, and ... erm ... graphs. -- The SPSS Chart Builder. -- Histograms: a good way to spot obvious problems. -- Boxplots (box-whisker diagrams). -- Graphing means: bar charts and error bars. -- Simple bar charts for independent means. -- Clustered bar charts for independent means. -- Simple bar charts for related means. -- Clustered bar charts for related means. -- Clustered bar charts for 'mixed' designs. -- Line charts. -- Graphing relationships: the scatterplot. -- Simple scatterplot. -- Grouped scatterplot. -- Simple and grouped 3-D scatterplots. -- Matrix scatterplot. -- Simple dot plot or density plot. -- Drop-line graph. -- Editing graphs. -- Exploring assumptions. -- What are assumptions?. -- Assumptions of parametric data. -- The assumption of normality. -- Oh, no, it's that pesky frequency distribution again: checking normality visually. -- Quantifying normality with numbers. -- Exploring groups of data. -- Testing whether a distribution is normal. -- Doing the Kolmogorov-Smirnov test on SPSS. -- Output from the explore procedure. -- Reporting the K-S test. -- Testing for homogeneity of variance. -- Levene's test. -- Reporting Levene's test. -- Correcting problems in the data. -- Dealing with outliers. -- Dealing with non-normality and unequal variances. -- Transforming the data using SPSS. -- When it all goes horribly wrong. -- Correlation. -- Looking at relationships?. -- How do we measure relationships?. -- A detour into the murky world of covariance. -- Standardization and the correlation coefficient. -- The significance of the correlation coefficient. -- Confidence intervals for r. -- A word of warning about interpretation: causality. -- Data entry for correlation analysis using SPSS. -- Bivariate correlation. -- General procedure for running correlations on SPSS. -- Pearson's correlation coefficient.
-- Spearman's correlation coefficient. -- Kendall's tau (non-parametric). -- Biserial and point-biserial correlations. -- Partial correlation. -- The theory behind part and partial correlation. -- Partial correlation using SPSS. -- Semi-partial (or part) correlations. -- Comparing correlations. -- Comparing independent rs. -- Comparing dependent rs. -- Calculating the effect size. -- How to report correlation coefficients. -- Regression. -- An introduction to regression. -- Some important information about straight lines. -- The method of least squares. -- Assessing the goodness of fit: sums of squares, R and R². -- Assessing individual predictors. -- Doing simple regression on SPSS. -- Interpreting a simple regression. -- Overall fit of the model. -- Model parameters. -- Using the model. -- Multiple regression: the basics. -- An example of a multiple regression model. -- Sums of squares, R and R². -- Methods of regression. -- How accurate is my regression model?. -- Assessing the regression model I: diagnostics. -- Assessing the regression model II: generalization. -- How to do multiple regression using SPSS. -- Some things to think about before the analysis. -- Main options. -- Statistics. -- Regression plots. -- Saving regression diagnostics. -- Further options. -- Interpreting multiple regression. -- Descriptives. -- Summary of model. -- Model parameters. -- Excluded variables. -- Assessing the assumption of no multicollinearity. -- Casewise diagnostics. -- Checking assumptions. -- How to report multiple regression. -- Categorical predictors and multiple regression. -- Dummy coding. -- SPSS output for dummy variables. -- Logistic regression. -- Background to logistic regression. -- What are the principles behind logistic regression?. -- Assessing the model: R and R². -- Assessing the contribution of predictors: the Wald statistic. -- The odds ratio: Exp(B). -- Methods of logistic regression. -- Assumptions and things that can go wrong. -- Assumptions. -- Incomplete information from the predictors. -- Complete separation. -- Overdispersion. -- Binary logistic regression: an example that will make you feel eel. -- The main analysis. -- Method of regression. -- Categorical predictors. -- Obtaining residuals. -- Further options. -- Interpreting logistic regression. -- The initial model. -- Step 1: intervention. -- Listing predicted probabilities. -- Interpreting residuals. -- Calculating the effect size. -- How to report logistic regression. -- Testing assumptions: another example. -- Testing for linearity of the logit. -- Testing for multicollinearity. -- Predicting several categories: multinomial regression. -- Running multinomial logistic regression in SPSS. -- Statistics. -- Other options. -- Interpreting the multinomial logistic regression output. -- Reporting the results. -- Comparing two means. -- Looking at differences. -- A problem with error bar graphs of repeated-measures designs. -- Step 1: calculate the mean for each participant. -- Step 2: calculate the grand mean. -- Step 3: calculate the adjustment factor. -- Step 4: create adjusted values for each variable. -- The t-test. -- Rationale for the t-test. -- Assumptions of the t-test. -- The dependent t-test. -- Sampling distributions and the standard error. -- The dependent t-test equation explained. -- The dependent t-test and the assumption of normality. -- Dependent t-test using SPSS. -- Output from the dependent t-test. -- Calculating the effect size. -- Reporting the dependent t-test.
-- The independent t-test. -- The independent t-test equation explained. -- The independent t-test and the assumption of normality. -- The independent t-test using SPSS. -- Output from the independent t-test. -- Calculating the effect size. -- Reporting the independent t-test. -- Between groups or repeated measures?. -- The t-test as a general linear model. -- What if my data are not normally distributed?. -- Comparing several means: ANOVA (GLM 1). -- The theory behind ANOVA. -- Inflated error rates. -- Interpreting F. -- ANOVA as regression. -- Logic of the F-ratio. -- Total sum of squares (SST). -- Model sum of squares (SSM). -- Residual sum of squares (SSR). -- Mean squares. -- The F-ratio. -- Assumptions of ANOVA. -- Planned contrasts. -- Post hoc procedures. -- Running one-way ANOVA on SPSS. -- Options. -- Output from one-way ANOVA. -- Output for the main analysis. -- Output for planned comparisons. -- Output for post hoc tests. -- Calculating the effect size. -- Reporting results from one-way independent ANOVA. -- Violations of assumptions in one-way independent ANOVA. -- Analysis of covariance, ANCOVA (GLM 2). -- What is ANCOVA?. -- Assumptions and issues in ANCOVA. -- Independence of the covariate and treatment effect. -- Homogeneity of regression slopes. -- Conducting ANCOVA on SPSS. -- Inputting data. -- Initial considerations: testing the independence of the independent variable and covariate. -- The main analysis. -- Contrasts and other options. -- Interpreting the output from ANCOVA. -- What happens when the covariate is excluded?. -- The main analysis. -- Contrasts. -- Interpreting the covariate. -- ANCOVA run as multiple regression. -- Testing the assumption of homogeneity of regression slopes. -- Calculating the effect size. -- Reporting results. -- What to do when assumptions are violated in ANCOVA. -- Factorial ANOVA (GLM 3). -- Theory of factorial ANOVA (between-groups). -- Factorial designs. -- An example with two independent variables. -- Total sums of squares (SST). -- The model sum of squares (SSM). -- The residual sum of squares (SSR). -- The F-ratios. -- Factorial ANOVA using SPSS. -- Entering the data and accessing the main dialog box. -- Graphing interactions. -- Contrasts. -- Post hoc tests. -- Options. -- Output from factorial ANOVA. -- Output for the preliminary analysis. -- Levene's test. -- The main ANOVA table. -- Contrasts. -- Simple effects analysis. -- Post hoc analysis. -- Interpreting interaction graphs. -- Calculating effect sizes. -- Reporting the results of two-way ANOVA. -- Factorial ANOVA as regression. -- What to do when assumptions are violated in factorial ANOVA. -- Repeated-measures designs (GLM 4). -- Introduction to repeated-measures designs. -- The assumption of sphericity. -- How is sphericity measured?. -- Assessing the severity of departures from sphericity. -- What is the effect of violating the assumption of sphericity?. -- What do you do if you violate sphericity?. -- Theory of one-way repeated-measures ANOVA. -- The total sum of squares (SST). -- The within-participant sum of squares (SSW). -- The model sum of squares (SSM). -- The residual sum of squares (SSR). -- The mean squares. -- The F-ratio. -- The between-participant sum of squares. -- One-way repeated-measures ANOVA using SPSS. -- The main analysis. -- Defining contrasts for repeated-measures. -- Post hoc tests and additional options. -- Output for one-way repeated-measures ANOVA. -- Descriptives and other diagnostics. -- Assessing and correcting for sphericity: Mauchly's test.
-- The main ANOVA. -- Contrasts. -- Post hoc tests. -- Effect sizes for repeated-measures ANOVA. -- Reporting one-way repeated-measures ANOVA. -- Repeated-measures with several independent variables. -- The main analysis. -- Contrasts. -- Simple effects analysis. -- Graphing interactions. -- Other options. -- Output for factorial repeated-measures ANOVA. -- Descriptives and main analysis. -- The effect of drink. -- The effect of imagery. -- The interaction effect (drink x imagery). -- Contrasts for repeated-measures ANOVA. -- Reporting the results from factorial repeated-measures ANOVA. -- What to do when assumptions are violated in repeated-measures ANOVA. -- Mixed design ANOVA (GLM 5). -- Mixed designs. -- What do men and women look for in a partner?. -- Mixed ANOVA on SPSS. -- The main analysis. -- Other options. -- Output for mixed factorial ANOVA: main analysis. -- The main effect of gender. -- The main effect of looks. -- The main effect of charisma. -- The interaction between gender and looks. -- The interaction between gender and charisma. -- The interaction between attractiveness and charisma. -- The interaction between looks, charisma and gender. -- Conclusions. -- Calculating effect sizes. -- Reporting the results of mixed ANOVA. -- What to do when the assumptions are violated in mixed ANOVA?. -- Non-parametric tests. -- When to use non-parametric tests. -- Comparing two independent conditions: the Wilcoxon rank-sum test and Mann-Whitney test. -- Theory. -- Inputting data and provisional analysis. -- Output from the Mann-Whitney test. -- Calculating an effect size. -- Writing the results. -- Comparing two related conditions: the Wilcoxon signed-rank test. -- Theory of the Wilcoxon signed-rank test. -- Running the analysis. -- Output for the ecstasy group. -- Output for the alcohol group. -- Calculating an effect size. -- Writing the results. -- Differences between several independent groups: the Kruskal-Wallis test. -- Theory of the Kruskal-Wallis test. -- Inputting data and provisional analysis. -- Doing the Kruskal-Wallis test on SPSS. -- Output from the Kruskal-Wallis test. -- Post hoc tests for the Kruskal-Wallis test. -- Testing for trends: the Jonckheere-Terpstra test. -- Calculating an effect size. -- Writing and interpreting results. -- Multivariate analysis of variance (MANOVA). -- When to use MANOVA. -- Introduction: similarities and differences to ANOVA. -- Words of warning. -- The example for this chapter. -- Theory of MANOVA. -- Introduction to matrices. -- Some important matrices and their functions. -- Calculating MANOVA by hand: a worked example. -- Principle of MANOVA test statistics. -- Practical issues when conducting MANOVA. -- Assumptions and how to check them. -- Choosing a test statistic. -- Follow-up analysis. -- MANOVA on SPSS. -- The main analysis. -- Multiple comparisons in MANOVA. -- Additional options. -- Output from MANOVA. -- Preliminary analysis and testing assumptions. -- MANOVA test statistics. -- Univariate test statistics. -- SSCP matrices. -- Contrasts. -- Reporting results from MANOVA. -- Following up MANOVA with discriminant analysis. -- Output from the discriminant analysis. -- Reporting results from the discriminant analysis. -- Some final remarks. -- The final interpretation. -- Univariate ANOVA or discriminant analysis?. -- What to do when assumptions are violated in MANOVA. -- Exploratory factor analysis. -- When to use factor analysis. -- Factors. -- Graphical representation of factors. -- Mathematical representation of factors. -- Factor scores.
-- Discovering factors. -- Choosing a method. -- Communality. -- Factor analysis vs. principal component analysis. -- Theory behind principal component analysis. -- Factor extraction: eigenvalues and scree plot. -- Improving interpretation: factor rotation. -- Research example. -- Before you begin. -- Running the analysis. -- Factor extraction on SPSS. -- Rotation. -- Scores. -- Options. -- Interpreting output from SPSS. -- Preliminary analysis. -- Factor extraction. -- Factor rotation. -- Factor scores. -- Summary. -- How to report factor analysis. -- Reliability analysis. -- Measures of reliability. -- Interpreting Cronbach's α (some cautionary tales...). -- Reliability analysis on SPSS. -- Interpreting output. -- How to report reliability analysis. -- Categorical data. -- Analysing categorical data. -- Theory of analysing categorical data. -- Pearson's chi-square test. -- Fisher's exact test. -- The likelihood ratio. -- Yates's correction. -- Assumptions of the chi-square test. -- Doing chi-square on SPSS. -- Entering data: raw scores. -- Entering data: weight cases. -- Running the analysis. -- Output for the chi-square test. -- Breaking down a significant chi-square test with standardized residuals. -- Calculating an effect size. -- Reporting the results of chi-square. -- Several categorical variables: loglinear analysis. -- Chi-square as regression. -- Assumptions in loglinear analysis. -- Loglinear analysis using SPSS. -- Initial considerations. -- The loglinear analysis. -- Output from loglinear analysis. -- Following up loglinear analysis. -- Effect sizes in loglinear analysis. -- Reporting the results of loglinear analysis. -- Multilevel linear models. -- Hierarchical data. -- The intraclass correlation. -- Benefits of multilevel models. -- Theory of multilevel linear models. -- An example. -- Fixed and random coefficients. -- The multilevel model. -- Assessing the fit and comparing multilevel models. -- Types of covariance structures. -- Some practical issues. -- Assumptions. -- Sample size and power. -- Centring variables. -- Multilevel modelling on SPSS. -- Entering the data. -- Ignoring the data structure: ANOVA. -- Ignoring the data structure: ANCOVA. -- Factoring in the data structure: random intercepts. -- Factoring in the data structure: random intercepts and slopes. -- Adding an interaction to the model. -- Growth models. -- Growth curves (polynomials). -- An example: the honeymoon period. -- Restructuring the data. -- Running a growth model on SPSS. -- Further analysis. -- How to report a multilevel model.
541 _aMERCAWORLD
_cPurchase
_d11/07/2017
_hInvoice - 7256
591 _anewadq08
650 1 7 _2LEMB
_aCiencias sociales
_xMétodos estadísticos
650 1 4 _aSPSS (Computer program)
942 _cBK
999 _c110563
_d110563