TY - JOUR
T1 - Creating an Instrument to Measure Student Response to Instructional Practices
AU - DeMonbrun, Matt
AU - Finelli, Cynthia J.
AU - Prince, Michael
AU - Borrego, Maura
AU - Shekhar, Prateek
AU - Henderson, Charles
AU - Waters, Cindy
N1 - Publisher Copyright:
© 2017 ASEE
PY - 2017/4
Y1 - 2017/4
AB - Background: Calls for the reform of education in science, technology, engineering, and mathematics (STEM) have inspired many instructional innovations, some research based. Yet adoption of such instruction has been slow. Research has suggested that students' response may significantly affect an instructor's willingness to adopt different types of instruction. Purpose: We created the Student Response to Instructional Practices (StRIP) instrument to measure the effects of several variables on student response to instructional practices. We discuss the step-by-step process for creating this instrument. Design/Method: The development process had six steps: item generation and construct development, validity testing, implementation, exploratory factor analysis, confirmatory factor analysis, and instrument modification and replication. We discuss pilot testing of the initial instrument, construct development, and validation using exploratory and confirmatory factor analyses. Results: This process produced 47 items measuring three parts of our framework. Types of instruction separated into four factors (interactive, constructive, active, and passive); strategies for using in-class activities into two factors (explanation and facilitation); and student responses to instruction into five factors (value, positivity, participation, distraction, and evaluation). Conclusions: We describe the design process and final results for our instrument, a useful tool for understanding the relationship between type of instruction and students' response.
KW - active learning
KW - factor analysis
KW - instructional methods
KW - student resistance
UR - http://www.scopus.com/inward/record.url?scp=85017616852&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85017616852&partnerID=8YFLogxK
DO - 10.1002/jee.20162
M3 - Article
AN - SCOPUS:85017616852
SN - 1069-4730
VL - 106
SP - 273
EP - 298
JO - Journal of Engineering Education
JF - Journal of Engineering Education
IS - 2
ER -