International Association of Educators   |  ISSN: 2834-7919   |  e-ISSN: 1554-5210

Original article | International Journal of Progressive Education 2021, Vol. 17(1) 232-246

Evaluation with Multi-Surface Rasch Measurement Model of Performance Applications in Higher Education

Fatih Doğan & Dilek Tekin

pp. 232 - 246   |  DOI: https://doi.org/10.29329/ijpe.2021.329.15   |  Manu. Number: MANU-2003-21-0007

Published online: February 01, 2021


Abstract

The purpose of this research is to analyze, through peer review, the instructional materials prepared by chemistry teacher candidates (CTCs) at university with the multi-facet Rasch rating scale model (MRSM). It also aims to determine CTCs' awareness of the instructional technologies and material design course. The research group consists of 8 CTCs enrolled in the undergraduate chemistry education program of the Faculty of Education at Çanakkale Onsekiz Mart University in the 2019-2020 academic year. These CTCs are enrolled in the Teaching Chemistry II course and have already taken the instructional technologies and material design course. The instructional material preparation skills of the CTCs were determined using a criteria form developed from the research project evaluation criteria of TÜBİTAK and the learning outcomes of the course, and the results were analyzed with the MRSM. The model's surfaces comprised 8 CTCs, 8 instructional materials, and 15 criterion items. According to the results, the CTCs differed in terms of severity/leniency behavior. In addition, while the CTCs had difficulty with some criteria, they met others. It was also determined that the instructional material preparation skills of the CTCs differed from one another.
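The multi-facet (many-facet) Rasch model referred to in the abstract expresses the log-odds of a rating in terms of additive facet parameters: examinee ability, item (criterion) difficulty, rater severity, and rating-scale thresholds. The sketch below, in Python, is a minimal illustration of that rating-scale parameterization; the function name, the specific parameter values, and the four-category scale are illustrative assumptions, not taken from the article, which used dedicated MRSM software.

```python
import math

def mfrm_category_probs(ability, item_difficulty, rater_severity, thresholds):
    """Category probabilities under a many-facet Rasch rating scale model.

    The log-odds of scoring in category k rather than k-1 is
    ability - item_difficulty - rater_severity - thresholds[k-1],
    all expressed in logits; category 0 is the reference category.
    """
    # Cumulative sums of the step log-odds give unnormalized log-probabilities.
    logits = [0.0]
    for tau in thresholds:
        logits.append(logits[-1] + (ability - item_difficulty - rater_severity - tau))
    total = sum(math.exp(x) for x in logits)
    return [math.exp(x) / total for x in logits]

# Illustrative values: an able candidate rated by a lenient peer
# (negative severity) on an average-difficulty criterion.
probs = mfrm_category_probs(ability=1.0, item_difficulty=0.0,
                            rater_severity=-0.5, thresholds=[-1.0, 0.0, 1.0])
print([round(p, 3) for p in probs])  # four category probabilities summing to 1
```

Because the facets are additive on the logit scale, a severe rater shifts every candidate's expected rating downward by the same amount, which is what lets the model separate rater severity/leniency from true differences in material quality.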

Keywords: Instructional Materials, Multi-Surface Rasch Measurement Model, Item Response Theory


How to Cite this Article?

APA 6th edition
Dogan, F. & Tekin, D. (2021). Evaluation with Multi-Surface Rasch Measurement Model of Performance Applications in Higher Education. International Journal of Progressive Education, 17(1), 232-246. doi: 10.29329/ijpe.2021.329.15

Harvard
Dogan, F. and Tekin, D. (2021). Evaluation with Multi-Surface Rasch Measurement Model of Performance Applications in Higher Education. International Journal of Progressive Education, 17(1), pp. 232-246.

Chicago 16th edition
Dogan, Fatih and Dilek Tekin (2021). "Evaluation with Multi-Surface Rasch Measurement Model of Performance Applications in Higher Education". International Journal of Progressive Education 17 (1):232-246. doi:10.29329/ijpe.2021.329.15.

References
  1. Airasian, P. (1994). Classroom assessment. New York: McGraw Hill. [Google Scholar]
  2. Akiyama, T. (2012). A close look at English teacher employment examinations (ETEEs): How do raters assess? Proceedings of The 17th Conference of Pan-Pacific Association of Applied Linguistics. Retrieved 20.12.2014 from http://www.paaljapan.org/conference2012/proc_PAAL2012/pdf/poster/P-15.pdf [Google Scholar]
  3. Apperson, J. M., Laws, E. L., & Scepansky, J. A. (2006). The impact of presentation graphics on students’ experience in the classroom. Computers and Education, 47(1), 116-126. [Google Scholar]
  4. Audrey, R. M-Q. (2008). Utilizing powerpoint presentation to promote fall prevention among older adults. The Health Educator, 40(1), 46-52. [Google Scholar]
  5. Ayre, C., & Scally A. J. (2014). Critical values for Lawshe’s content validity ratio: revisiting the original methods of calculation. Measurement and Evaluation in Counseling and Development, 47(1), 79–86.  [Google Scholar]
  6. Basturk, R. (2008). Applying the many-facet Rasch model to evaluate PowerPoint presentation performance in higher education. Assessment & Evaluation in Higher Education, 33(4), 431-444. [Google Scholar]
  8. Başturk, R. (2010). Bilimsel araştırma ödevlerinin çok-yüzeyli Rasch ölçme modeli ile değerlendirilmesi. Eğitimde ve Psikolojide Ölçme ve Değerlendirme Dergisi, 1(1), 51-57. [Google Scholar]
  9. Best, J.W., & Kahn, J.V. (2006), Research in education (10th Ed.), Boston MA. Pearson [Google Scholar]
  10. Chang, M. L., & Engelhard, G., Jr. (2015). Examining the teachers’ sense of efficacy scale at the item level with Rasch measurement model. Journal of Psychoeducational Assessment, 34(2), 177-191. [Google Scholar]
  11. Cheng, W., & Warren, M. (1999). Peer and teacher assessment of the oral and written tasks of a group project. Assessment & Evaluation in Higher Education, 24(3), 301 [Google Scholar]
  12. Cohen, J. (1992). Statistical power analysis, Current Directions in Psychological Science, 1(3), 98-101. [Google Scholar]
  13. Çalışkan, H., & Kaşıkçı, Y. (2010). The application of traditional and alternative assessment and evaluation tools by teachers in social studies. Procedia Social and Behavioral Sciences 2, 4152–4156. [Google Scholar]
  14. DeMars, C. (2010). Item response theory. Oxford, UK: Oxford University Press. [Google Scholar]
  15. DiMartino, J., Castaneda, A., Brownstein, M. & Miles, S. (2007).  Authentic assessment. Principal’s Research Review, 2(4), 1-8. [Google Scholar]
  16. Eckes, T. (2005). Examining rater effects in TestDaF writing and speaking performance assessments: A many-facet Rasch analysis. Language Assessment Quarterly: An International Journal, 2(3), 197-221. [Google Scholar]
  18. Ekiz, D. (2009). Bilimsel araştırma yöntemleri (Expanded 2nd ed.). Ankara: Anı Yayıncılık. [Google Scholar]
  19. Elhan, A. H., & Atakurt, Y. (2005). Ölçeklerin değerlendirilmesinde niçin Rasch analizi kullanılmalı. Ankara Üniversitesi Tıp Fakültesi Mecmuası, 58, 47-50. [Google Scholar]
  20. Engelhard, G., & Myford, C.M. (2003). Monitoring faculty consultant performance in the Advanced Placement English Literature and Composition Program with a many-faceted Rasch model. ETS Research Report Series, (01). Princeton, NJ: Educational Testing Service [Google Scholar]
  21. Engelhard, G., Jr. (1992). The measurement of writing ability with a Many-Faceted Rasch model. Applied Measurement in Education, 5. 171-191. [Google Scholar]
  22. Farrokhi, F., Esfandiari, R., & Schaefer, E. (2012). A many-facet Rasch measurement of differential rater severity/leniency in three types of assessment. JALT Journal, 34(1), 79-102. [Google Scholar]
  23. Gönen, M.E., Çelebi, E., & Işıtan, S., (2004). İlköğretim 5., 6. ve 7. sınıf öğrencilerinin okuma alışkanlıklarının incelenmesi, Milli Eğitim Dergisi, 164 [Google Scholar]
  24. Hamayan, E. V. (1995). Approaches to alternative assessment. Annual Review of Applied Linguistics, 15, 212-226. [Google Scholar]
  25. Hambleton, R. K., Swaminathan, H., & Rogers, H. J. (1991). Fundamentals of item response theory. Newbury Park, CA: Sage [Google Scholar]
  26. Haney, W. & Madaus, G. (1989). Searching for alternatives to standardized tests: whys, whats, and whithers. Phi Delta Kappan, 70, 683–687. [Google Scholar]
  27. Iramaneerat, C., Myford, C.M., Yudkowsky, R., & Lowenstein, T. (2009).Evaluating the effectiveness of rating instruments for a communication skills assessment of medical residents. Advances in Health Sciences Education,14(4), 575-594 [Google Scholar]
  28. Kenyon, D. M., & Stansfield, C. W. (1992, April). Examining the validity of a scale used in a performance assessment from many angles using the Many-Faceted Rasch Model. Paper presented at the meeting of the American Educational Research Association, San Francisco, CA. [Google Scholar]
  29. Lawshe, C.H. (1975). A quantitative approach to content validity. Personnel Psychology, 28, 563–575 [Google Scholar]
  30. Lee, M., Peterson, J. J., & Dixon, A. (2010). Rasch calibration of physical activity self-efficacy and social support scale for persons with intellectual disabilities. Research in Developmental Disabilities, 31(4), 903−913.  [Google Scholar]
  31. Linacre, J.M., Wright B.D., & Lunz M.E. (1990). A facets model of judgmental scoring. Memo 61. MESA Psychometric Laboratory. University of Chicago.  [Google Scholar]
  32. Linacre, J. M. (1993). Generalizability theory and many facet Rasch measurement. Annual Meeting of The American Educational Research Association. (April, 13, 1993), (ED 364 573). Atlanta Georgia. [Google Scholar]
  33. Linacre, J.M. (1995). Rasch measurement transaction. Chicago, IL: MESA Press. [Google Scholar]
  34. Linacre, J.M. (2003). Size vs. significance: Standardized chi-square fit statistic. Rasch Measurement Transactions, 17(1), 918. [Google Scholar]
  35. Linn, R.L. & Gronlund, N.E. (1999). Measurement and assessment in teaching (7th Edn), Columbus, OH: Merill. [Google Scholar]
  36. Looney, M. A. (1997). A many-facet Rasch analysis of 1994 olympic figure skating scores [Abstract]. Research Quarterly for Exercise and Sport, 68(Suppl. I), A-53. [Google Scholar]
  37. Lumley, T. & McNamara, T.F. (1993). Rater characteristics and rater bias: implications for training. Paper presented at the Language Testing Research Colloquium, Cambridge, UK. ED: 365 091 [Google Scholar]
  38. Lunz, M. E., Wright, B. D., & Linacre, J. M. (1990). Measuring the impact of judge severity of examination scores. Applied Measurement in Education, 3, 331-345. [Google Scholar]
  39. Lynch, B. K., & McNamara, T. F. (1998). Using G-theory and many-facet Rasch measurement in the development of performance assessments of the ESL speaking skills of immigrants. Language Testing, 15(2), 158-180. [Google Scholar]
  40. Matsuno, S. (2009). Self-, peer-, and teacher-assessments in Japanese university EFL writing classrooms. Language Testing, 26(1), 75-100. [Google Scholar]
  41. Mearoff, G.I. (1991). Assessing alternative assessment. Phi Delta Kappan, 73(4), 272–281.  [Google Scholar]
  42. Neil, D.M. & Medina, N.J. (1989). Standardized testing: harmful to educational health. Phi Delta Kappan, 70, 688–697 [Google Scholar]
  43. Özbaşi, D., & Arcagök, S. (2019). An investigation of pre-service preschool teachers’ projects using the many-facet Rasch model. International Journal of Progressive Education, 15(4), 157-173. [Google Scholar]
  44. Park, H., Kim, H. S., Cha, Y. J., Choi, J., Minn, Y., Kim, K. S., & Kim, S. H. (2018). The effect of mental rotation on surgical pathological diagnosis. Yonsei medical journal, 59(3), 445-451. [Google Scholar]
  45. Rasch, G. (1980). Probabilistic models for some intelligence and attainment tests. Chicago, IL.: MESA Press. [Google Scholar]
  46. Rennert-Ariev, P. (2005). A theoretical model for the authentic assessment of teaching. Practical Assessment Research and Evaluation, 10(2), 1-11.  [Google Scholar]
  47. Sudweeks, R.R., Reeve, S., & Bradshaw, W.S. (2004). A comparison of generalizability theory and many-facet Rasch measurement in an analysis of college sophomore writing. Assessing Writing, 9(3), 239-261 [Google Scholar]
  48. Tomlinson, C.A. (2001). Grading for success. Educational Leadership, 3: 12–15. [Google Scholar]
  49. Toptaş, V. (2011). Sınıf öğretmelerinin matematik dersinde alternatif ölçme ve değerlendirme yöntemlerinin kullanımı ile ilgili algıları. Eğitim ve Bilim, 36(159), 205-219. [Google Scholar]
  50. TÜBİTAK (2018). Türkiye bilimsel ve teknolojik araştırma kurumu 1001-bilimsel ve teknolojik araştırma projelerini destekleme programı proje başvuru formu. Retrieved 2.2.2020 from http://tubitak.gov.tr/tr/destekler/akademik/ulusal-destek-programlari/1001/icerik-basvuruformlari [Google Scholar]
  51. Wiggins, G.P. (1989). A true test: toward more authentic and equitable assessment. Phi Delta Kappan, 70, 9, 703–713. [Google Scholar]
  52. Wolf, A. (1995). Authentic assessments in a competitive sector: Institutional prerequisites and cautionary tales. In H. Torrance (Ed.), Evaluating authentic assessment (pp. 78-87). Buckingham: Open University Press. [Google Scholar]
  53. Yuzuak, A.V., Erten, S. & Kara, Y. (2019). Analysis of laboratory videos of science teacher candidates with many-facet Rasch measurement model. Journal of Education in Science, Environment and Health (JESEH), 5(2), 146-155.  [Google Scholar]
  54. Yüzüak, A. V., Yüzüak, B., & Kaptan, F. (2015). Performans görevinin akran gruplar ve öğretmen yaklaşımları doğrultusunda çok-yüzeyli Rasch ölçme modeli ile analizi. Eğitimde ve Psikolojide Ölçme ve Değerlendirme Dergisi, 6(1), 1-11 [Google Scholar]
  55. Zemelman, S., Daniels, H. & Hyde, A. (1998). Best practices, Portsmouth, NH: Heinemann. [Google Scholar]