November 2013 Educational Researcher: Automated Test Construction Can Better Assess Student Mastery of Common Core State Standards
Issue Also Looks at How Principals’ Leadership Can Affect Student Learning, Challenges of Outcome-Reporting Bias in Education Research, and Principles for Stronger Research of Mixed Reality Instruction
Source Newsroom: American Educational Research Association (AERA)
Newswise — WASHINGTON, D.C., November 13, 2013 – The November 2013 issue of Educational Researcher (ER), a peer-reviewed journal of the American Educational Research Association (AERA), is now available on the association’s website. Included in this issue is a report on an innovative algorithm for automated test construction that results in much more highly aligned – and therefore more valid – assessments of student mastery of state content standards. The November issue of ER includes three feature articles and one essay. Links to the full text of each article are available through AERA’s website at www.aera.net/ERNov13.
This month’s feature articles include the following:
• “Constructing Aligned Assessments Using Automated Test Construction,” by Andrew Porter, Morgan S. Polikoff, Katherine M. Barghaus, and Rui Yang, presents an algorithm for automated test construction that yields assessments much more closely aligned to the Common Core State Standards (CCSS) than currently used measures. Alignment describes the strength of the relationship between what is tested and what test makers want to test; poorly aligned tests fail to provide teachers with information about the extent to which their instruction has helped their students learn core content. The versatile procedure proposed by Porter, Polikoff, Barghaus, and Yang allows alignment to be built into test construction from the outset, which will be important for the first assessments of the CCSS by the 2014-15 academic year. Porter is a professor at the University of Pennsylvania: email@example.com, (215) 898-7014. Polikoff is an assistant professor at the University of Southern California: firstname.lastname@example.org, (213) 740-6741. Barghaus is a research associate at the University of Pennsylvania: email@example.com, (203) 313-0220. Yang is a Ph.D. candidate at the University of Pennsylvania: firstname.lastname@example.org, (317) 379-5292.
• “Effective Instructional Time Use for School Leaders: Longitudinal Evidence from Observations of Principals,” by Jason Grissom, Susanna Loeb, and Ben Master, examines the association between school principals’ instructional leadership behaviors and student achievement gains using in-person, full-day observations of approximately 100 urban principals in the Miami-Dade School District, collected over three school years. Instructional leadership is defined as those functions that support classroom teaching and student learning. Grissom, Loeb, and Master found that time spent on certain instructional functions – teacher coaching, evaluation, and developing the school’s educational program – is associated with positive achievement gains, while the most common instructional function used by principals – classroom walkthroughs – is negatively associated with student growth. Grissom is an assistant professor at Vanderbilt University: email@example.com, (615) 322-6441. Loeb is a professor at Stanford University: firstname.lastname@example.org, (650) 736-1258. Master is a doctoral candidate at Stanford’s Graduate School of Education: email@example.com, (617) 422-9075.
• “Outcome-Reporting Bias in Education Research,” by Therese D. Pigott, Jeffrey C. Valentine, Joshua R. Polanin, Ryan T. Williams, and Dericka D. Canada, examines outcome-reporting bias in education research by looking at what is edited out of dissertations before they are submitted for publication. Pigott, Valentine, Polanin, Williams, and Canada found that statistically nonsignificant outcomes were 30 percent more likely than statistically significant outcomes to be omitted from a published study. When outcomes are omitted from study reports, conclusions drawn from the incomplete evidence may be biased. Pigott is an associate dean and professor at Loyola University Chicago: firstname.lastname@example.org, (312) 915-6245. Valentine is an associate professor at the University of Louisville: email@example.com, (502) 852-3830. Polanin is a postdoctoral fellow at Vanderbilt University: firstname.lastname@example.org, (217) 369-2046. Williams is an assistant professor at the University of Memphis: email@example.com, (812) 391-2539. Canada is a diversity fellow and doctoral student at Boston College: firstname.lastname@example.org.
The essay in this issue, “Emboldened by Embodiment: Six Precepts for Research on Embodied Learning and Mixed Reality,” is by Robb Lindgren and Mina Johnson-Glenberg. Lindgren and Johnson-Glenberg propose six principles for researchers examining the emerging instructional method of mixed reality. Mixed reality is a class of immersive technologies, typically involving real-world objects, that mix the digital with the physical. Mixed reality has the potential to transform computer-assisted learning. The authors argue that establishing a rigorous program of research is essential to assessing this emerging instructional approach. Lindgren is an assistant professor at the University of Illinois: email@example.com, (217) 244-3655. Johnson-Glenberg is an associate research scientist at Arizona State University: firstname.lastname@example.org, (480) 307-6811.
The American Educational Research Association (AERA) is the largest national professional organization devoted to the scientific study of education. Founded in 1916, AERA advances knowledge about education, encourages scholarly inquiry related to education, and promotes the use of research to improve education and serve the public good. Find AERA on Facebook and Twitter.