List of Tables
List of Figures
List of Boxes
List of Abbreviations and Acronyms
Foreword by Richard C. Kunkel
Preface
Acknowledgments
About the Authors
1. What Are Dispositions and Why Should We Measure Them?
What This Chapter Is About
The Importance of Measuring Dispositions
The Challenge
What Are Standards-Based Dispositions?
Hierarchical Relationships Among Knowledge, Skills, and Dispositions
Remembering Bloom
Dispositions and Accreditation: Requirements and Definitions
Measuring Dispositions: Sources of Confusion
Measuring Dispositions: Morals, Ethics, or Standards-Based?
Different Construct, Different Assessments, Similar Assessment Design Process
Wrap Up
Activity 1.1: Questions for Exploration
Activity 1.2: What Have You Noticed?
Activity 1.3: Assessment Belief Scale
Activity 1.4: Cognitive, Affective, and Psychomotor Objectives and Assessments
2. Methods for Assessing Dispositions
What This Chapter Is About
A Conceptual Framework for Measuring Dispositions
Measuring Teacher Dispositions: The State of the Art
Back to Basics: Bloom and Krathwohl
Available Methods for Measuring Dispositions or Affect
The Importance of Inference in Measuring Dispositions
Wrap Up
Activity 2.1: Questions for Exploration
Activity 2.2: Bloom and the INTASC Principles
Activity 2.3: Field Work
Activity 2.4: Review Your Feelings
3. DAATS Step 1: Assessment Design Inputs
Where We Have Been So Far
What This Chapter Is About
Why Are Purpose, Use, Propositions, and Content So Important?
DAATS Step 1A: Define the Purpose(s) and Use(s) of the System
DAATS Step 1B: Define the Propositions or Principles That Guide the System
DAATS Step 1C: Define the Conceptual Framework or Content of the System
DAATS Step 1D: Review Local Factors That Impact the System
Wrap Up
Worksheet 3.1: Purpose, Use, Propositions, Content, and Context Checksheet
Worksheet 3.2: Purpose, Use, and Content Draft
Worksheet 3.3: Propositions
Worksheet 3.4: Contextual Analysis
4. DAATS Step 2: Planning With a Continuing Eye on Valid Assessment Decisions
Where We Have Been So Far
What This Chapter Is About
DAATS Step 2A: Analyze Standards and Indicators
All Those Indicators
Why Bother?
DAATS Step 2B: Visualize the Teacher Demonstrating the Affective Targets
DAATS Step 2C: Select Assessment Methods at Different Levels of Inference
DAATS Step 2D: Build an Assessment Framework Correlating Standards and Methods
Wrap Up
Worksheet 4.1: Organizing for Alignment (Version 1)
Worksheet 4.2: Organizing for Alignment (Version 2)
Worksheet 4.3: Visualizing the Dispositional Statements
Worksheet 4.4: Selecting Assessment Methods for INTASC Indicators
Worksheet 4.5: Assessment Methods for INTASC Indicators: Blueprint
Worksheet 4.6: Cost/Benefit and Coverage Analysis of Assessment Methods
5. DAATS Step 3: Instrument Development
Where We Have Been So Far
What This Chapter Is About
DAATS Step 3A: Draft Items for Each Instrument
Thurstone Agreement Scales
Questionnaires, Interviews, and Focus Groups
Observed Performance
Thematic Apperception Tests or Situation Reflection Assessment
DAATS Step 3B: Review Items for Applicability to Values, Domain Coverage, and Job Relevance
Wrap Up
Worksheet 5.1: Creating Scales
Worksheet 5.2: Creating Questionnaires, Interviews, or K-12 Focus Group Protocols
Worksheet 5.3: Creating an Affective Behavior Checklist
Worksheet 5.4: Creating an Affective Behavior Rating Scale
Worksheet 5.5: Creating a Tally Sheet for Affective Observation
Worksheet 5.6: Checklist for Reviewing Scale Drafts
Worksheet 5.7: Review Sheets for Questionnaires and Interviews
Worksheet 5.8: Review Sheets for K-12 Focus Group Protocols
Worksheet 5.9: Checklist for Reviewing Observations and Behavioral Checklists
Worksheet 5.10: Coverage Check
Worksheet 5.11: Rating Form for Stakeholder Review
6. DAATS Step 4: Decision Making and Data Management
Where We Have Been So Far
What This Chapter Is About
DAATS Step 4A: Develop Scoring Rubrics
Dichotomous Response Scoring Keys
Rating Scale Rubrics
DAATS Step 4B: Determine How Data Will Be Combined and Used
Need for Shared Data
Data Storage
Data Aggregation
Maximizing the Utility of the Data for Decision Making
DAATS Step 4C: Develop Implementation Procedures and Materials
Preponderance of the Evidence vs. Cut Scores
Advising and Due Process
Scoring Procedures
Implementation
Wrap Up
Worksheet 6.1: Explanation of Dichotomous Scoring Decisions
Worksheet 6.2: Rubric Design
Worksheet 6.3: Sample Format for Candidate/Teacher Tracking Form
Worksheet 6.4: Format for Data Aggregation
Worksheet 6.5: Sample Disposition Event Report
Worksheet 6.6: Management Plan
7. DAATS Step 5: Credible Data
Where We Have Been So Far
What This Chapter Is About
What Is Psychometric Integrity and Why Do We Have to Worry About It?
DAATS Step 5A: Create a Plan to Provide Evidence of Validity, Reliability, Fairness, and Utility
Elements of a Plan
Element 7.1: Purpose and Use
Element 7.2: Construct Measured
Element 7.3: Interpretation and Reporting of Scores
Element 7.4: Assessment Specifications and Content Map
Element 7.5: Assessor/Rater Selection and Training Procedures
Element 7.6: Analysis Methodology
Element 7.7: External Review Personnel and Methodology
Element 7.8: Evidence of Validity, Reliability, and Fairness (VRF)
Psychometric Evidence Collected Already
Next Steps in Collecting Evidence of Validity, Reliability, and Fairness
Future Studies
DAATS Step 5B: Implement the Plan Conscientiously
Wrap Up
Worksheets and Examples
Worksheet 7.1: Assessment Specifications
Worksheet 7.2: Analysis of Appropriateness of Decisions for Teacher Failures
Worksheet 7.3: Analysis of Rehire Data
Worksheet 7.4: Program Improvement Record
Worksheet 7.5: Expert Rescoring
Worksheet 7.6: Fairness Review
Worksheet 7.7: Analysis of Remediation Efforts and EO Impact
Worksheet 7.8: Psychometric Plan Format
Example 1: Logistic Ruler for Content Validity
Example 2: Computation of the Lawshe (1975) Content Validity Ratio
Example 3: Disparate Impact Analysis
Example 4: Computation of Cohen's Kappa (1960) for Inter-rater Reliability
Example 5: Two Pearson Correlation Coefficients and Scatterplots: Disposition Scores Correlated With PRAXIS and Portfolio Scores
Example 6: Spearman Correlation Coefficient and Scatterplot: Disposition Scores Correlated With Principal Ratings
Example 7: Correlation Matrix and Scatterplots: Knowledge, Impact, Dispositions, Skills (KIDS)
Example 8: T-Test Comparing Dispositions of Mathematics and Science Teachers
Example 9: DIF Analysis for Programs
8. Using Teacher Scores for Continuous Improvement
What This Chapter Is About
Reasons Why We Use the Rasch Model
The Classical Approach
A Quick Overview of Where Rasch Fits Into the Grand Scheme of IRT Models
Rasch: The Basics
Getting Started
Differences That Item Writers Make
Guttman Scaling
A Sample Rasch Ruler
From Pictures to Numbers
The Fit Statistic
Gain Scores: Real or Imagined?
Ratings and Raters
Learning More About Rasch
Wrap Up
Activity 8.1: Decision-Making Tool for Measurement
9. Legal Integrity
What This Chapter Is About
Why Not Portfolios?
Why the Pied Piper?
What If? A Legal Scenario: Mary Beth JoAnne Sues XYZ University
MBJ Helps Us to Understand the Convergence of Psychometrics and Legal Requirements
Background Facts
Scenario #1
Scenarios #2, 3, and 4
Psychometric Issues and Legal Challenges in the Real World
Legal Issues and Precedents
Three Landmark Dispositions Cases in Two Years
Tide Changing in NCATE
Standards Are the Vanguard!
MBJ Revisited
End Note
Resource I. DAATS Steps and Worksheets
Resource II. INTASC Disposition Indicators
Glossary
Index