CoDAS
https://codas.org.br/article/doi/10.1590/2317-1782/20232022098en
Original Article

Reliability of a script concordance test for undergraduate speech-language therapy students

Construct validation and reproducibility of a script concordance test for undergraduate speech-language therapy students

Angélica Pilar Silva Ríos; Manuel Nibaldo del Campo Rivas; Patricia Katherine Kuncar Uarac; Víctor Antonio Calvo Sprovera

Abstract

Purpose

To estimate the reliability of a corpus of scripts designed for undergraduate Speech-Language Therapy students.

Methods

A descriptive, cross-sectional observational study was carried out. Qualitative variables were summarized as frequencies or proportions, and quantitative variables as means (95% CI). Reliability was estimated with Cronbach's α coefficient, and inter-rater agreement was determined with Fleiss's Kappa index. Analytical tests used a significance level of p < 0.05.
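
The two coefficients named above can be computed with standard formulas. The following is a minimal sketch, assuming a hypothetical raters-by-scripts score matrix (the variable names, panel size, and 1-5 rating scale are illustrative and not taken from the article); it computes Cronbach's α and Fleiss's κ with NumPy only.

import numpy as np

def cronbach_alpha(scores):
    # scores: (n_respondents, n_items) matrix of item scores.
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)        # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def fleiss_kappa(counts):
    # counts: (n_items, n_categories) matrix; counts[i, j] = number of raters
    # who assigned item i to category j (each row sums to the panel size).
    n_items = counts.shape[0]
    n_raters = counts[0].sum()
    p_j = counts.sum(axis=0) / (n_items * n_raters)                      # category proportions
    P_i = ((counts ** 2).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    P_e = (p_j ** 2).sum()                                               # chance agreement
    return (P_i.mean() - P_e) / (1 - P_e)

# Hypothetical data: 41 raters scoring 80 scripts on a 1-5 scale.
rng = np.random.default_rng(0)
ratings = rng.integers(1, 6, size=(41, 80))                              # raters x scripts
counts = np.stack([np.bincount(ratings[:, i], minlength=6)[1:] for i in range(80)])
print(f"alpha = {cronbach_alpha(ratings):.2f}, kappa = {fleiss_kappa(counts):.2f}")

With the study's actual data, ratings would hold the expert panel's responses for each script rather than random values.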

Results

A corpus of 80 scripts organized into four areas of speech-language therapy was validated by 41 speech-language pathologists. The professionals' mean experience was 17.1 years (SE = 2.4; 95% CI: 11.7-22.6). The reliability of the corpus was α = 0.67 (min = 0.34; max = 0.84), and inter-rater agreement was κ = 0.29 (min = 0.07; max = 0.45).

Conclusion

The corpus's reliability scores were similar to those reported by previous studies in other health professions. Having validated strategies that foster competency development and complement conventional training activities in undergraduate programs will contribute to improving the quality of training for future health professionals.

Keywords

Reliability; Reproducibility of Results; Higher Education; Speech-Language Pathology; Professional Competence

Resumen

Objective

To estimate the reliability and reproducibility of a corpus of scripts designed for undergraduate speech-language therapy training.

Methods

An observational, descriptive, cross-sectional study was conducted. Construct validity was estimated with Cronbach's α coefficient and reproducibility with Fleiss's Kappa index. Analytical tests used a significance level of p < 0.05.
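
For reference, the two coefficients named above have the following standard definitions (general textbook formulations, not formulas reported in the article). For k items with item-score variances \sigma^{2}_{Y_i} and total-score variance \sigma^{2}_{X}, and for N items each classified by n raters into categories j (with n_{ij} raters placing item i in category j):

\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right),
\qquad
\kappa = \frac{\bar{P} - \bar{P}_e}{1 - \bar{P}_e},
\quad \text{where} \quad
\bar{P} = \frac{1}{N}\sum_{i=1}^{N}\frac{\sum_{j} n_{ij}^{2} - n}{n(n-1)},
\qquad
\bar{P}_e = \sum_{j}\left(\frac{1}{Nn}\sum_{i=1}^{N} n_{ij}\right)^{2}.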

Results

A corpus of 80 scripts organized into four areas of speech-language therapy was created and validated by 41 speech-language pathologists. The professionals' mean experience was 17.1 years (SE = 2.4; 95% CI: 11.7-22.6). The reliability of the corpus was α = 0.67, and inter-rater agreement was κ = 0.29.

Conclusion

The reliability and reproducibility scores of the corpus were similar to those reported by previous studies in other health professions. Having validated strategies that foster competency development and complement training activities will contribute to improving the quality of training for future health professionals.

Keywords

Reliability; Reproducibility of Results; Higher Education; Speech-Language Pathology; Professional Competence
