A Framework for Reporting Performance on Problem-Solving Assessments: SPP Charts and Classroom Applications
PROCEEDINGS
Delwyn L. Harnisch, University of Illinois at Urbana- Champaign
Society for Information Technology & Teacher Education International Conference, Publisher: Association for the Advancement of Computing in Education (AACE), Waynesville, NC USA
Abstract
Many teacher educators and educational measurement specialists recognize that assessment needs to move from a task-based focus to a theory-based focus (Baker, O’Neil & Linn, 1993; Harnisch, 1994; Mislevy, 1993; Popham,
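The S-P (Student-Problem) chart and caution index cited in the references (Sato, 1990; Harnisch & Linn, 1981) can be sketched in a few lines. The following is an illustrative reimplementation under common formulations of the technique, not code from the SPP package itself; the function and variable names are my own.

```python
# Minimal sketch of an S-P (Student-Problem) table and Sato-style caution
# index, assuming a small 0/1 student-by-item score matrix. Illustrative
# only; names and details are not taken from the SPP package.
import numpy as np

def sp_table(scores):
    """Sort rows (students) by total score and columns (items) by
    facility, both descending, as in an S-P chart."""
    scores = np.asarray(scores)
    row_order = np.argsort(-scores.sum(axis=1), kind="stable")
    col_order = np.argsort(-scores.sum(axis=0), kind="stable")
    return scores[row_order][:, col_order], row_order, col_order

def caution_index(scores, i):
    """Caution index for student i: 1 - cov(u_i, n) / cov(u*_i, n),
    where n_j is the item total and u*_i is the Guttman (ideal)
    pattern with the same number-correct score. 0 means the response
    pattern matches the ideal; larger values flag aberrant patterns."""
    scores = np.asarray(scores, dtype=float)
    n = scores.sum(axis=0)              # item totals (facility counts)
    u = scores[i]
    t = int(u.sum())
    if t == 0 or t == len(n):
        return 0.0                      # all-wrong / all-right: no misfit
    ideal = np.zeros_like(u)
    ideal[np.argsort(-n, kind="stable")[:t]] = 1.0  # correct on easiest t items
    cov = lambda a: np.mean(a * n) - np.mean(a) * np.mean(n)
    return 1.0 - cov(u) / cov(ideal)
```

A student whose correct answers fall on the hardest items while missing the easiest ones receives a high caution index, signaling a response pattern a teacher may want to examine.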
Citation
Harnisch, D.L. (1995). A Framework for Reporting Performance on Problem-Solving Assessments: SPP Charts and Classroom Applications. In J. Willis, B. Robin & D. Willis (Eds.), Proceedings of SITE 1995--Society for Information Technology & Teacher Education International Conference (pp. 180-184). Waynesville, NC USA: Association for the Advancement of Computing in Education (AACE). Retrieved March 28, 2024 from https://www.learntechlib.org/primary/p/46603/.
References
- Dorr-Bremme, D.W. & Herman, J.L. (1986). Assessing Student Achievement: A profile of classroom practices. Los Angeles: Center for the Study of Evaluation, University of California.
- Haertel, E.H. (1991). Form and function in assessing science education. In G. Kulm & S.M. Malcom (Eds.), Science Assessment in the Service of Reform (pp. 233-245). Washington, DC: American Association for the Advancement of Science.
- Harnisch, D.L. (1987). A multilevel evaluation system using student-problem charts. A paper presented at the Annual Meeting of the American Educational Research Association, Washington, D.C., April, 1987.
- Harnisch, D. (1994). Performance assessment in review: New directions for assessing student understanding. International Journal of Educational Research.
- Harnisch, D.L. & Connell, M.L. (Eds.) (1990). An Introduction to Educational Information Technology. Tokyo, Japan: NEC Corporation, NEC Technical College.
- Harnisch, D.L. & Linn, R.L. (1981). Analysis of item response patterns: Questionable test data and dissimilar curriculum practices. Journal of Educational Measurement, 18(3), 133-146.
- Harnisch, D.L. & Mabry, L. (1993). Issues in the development and evaluation of alternative assessments. Journal of Curriculum Studies, 25(2), 179-187.
- Harnisch, D.L. & Romy, N. (1985). User’s Guide for the Student-Problem Package (SPP) on the IBM-PC. Champaign, Illinois: Office of Educational Testing, Research and Service, University of Illinois at Urbana-Champaign.
- Linn, R.L., Baker, E.L., & Dunbar, S.B. (1991). Complex performance-based assessment: Expectations and validation criteria. Educational Researcher, 20(8), 15-21.
- Mislevy, R.J. (1993). Foundations of a new test theory. In N. Frederiksen, R.J. Mislevy, & I.I. Bejar (Eds.), Test theory for a new generation of tests. Hillsdale, NJ: Erlbaum.
- National Council of Teachers of Mathematics (1989). Curriculum and Evaluation Standards for school mathematics. Reston, VA: Author.
- National Research Council. (1993, February). National science education standards: An enhanced sampler (Working Paper of the National Committee on Science Education Standards and Assessment). Washington, D.C.: Author.
- Popham, W.J. (1993). Educational testing in America: What's right, what's wrong? A criterion-referenced perspective. Educational Measurement: Issues and Practice, 12(1), 11-14.
- Sato, T. (1990). The S-P Chart and the Caution Index. In D.L. Harnisch & M.L. Connell (Eds.), An Introduction to Educational Information Technology (pp. 131-158). Tokyo, Japan: NEC Corporation, NEC Technical College.
- Sato, T. & Kurata (1977). Basic S-P score table characteristics. NEC Research and Development, 47, 64-71.
- Shavelson, R.J., Baxter,G.P., & Gao, X. (1993). Sampling variability of performance assessments. Journal of Educational Measurement, 30(3), 215-232.