
JRCERT Update






make significant program or curricular changes based on data collected from 1 cohort of students, or based on 1 unmet benchmark. This "knee jerk" reaction is imprudent because there might not be strong, consistent evidence to support the change. With a consistent upward or downward trend over several years or cohorts, generally 3 cohorts, comes a strong probability that change is warranted. If a negative trend is evident, corrective changes in the program should be implemented as soon as possible. If, on the other hand, students routinely exceed a particular benchmark, a complete change in the metric might be indicated to provide evidence in another way that students are being challenged to reach their highest potential.
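
To illustrate this kind of trend review, the following minimal Python sketch checks whether an outcome falls below a benchmark consistently across 3 cohorts before flagging a program change. The cohort labels, scores, and benchmark value are illustrative assumptions, not data from the article.

  # Hypothetical per-cohort mean scores for one outcome, oldest to newest.
  benchmark = 85.0
  cohort_means = {"2014": 83.2, "2015": 81.9, "2016": 80.5}

  scores = list(cohort_means.values())
  below = [score < benchmark for score in scores]
  declining = all(later < earlier for earlier, later in zip(scores, scores[1:]))

  if all(below) and declining:
      print("Consistent downward trend across 3 cohorts: corrective change is warranted.")
  elif any(below):
      print("Benchmark missed in an isolated cohort: continue monitoring before changing the program.")
  else:
      print("Benchmark consistently met: consider whether the metric still challenges students.")
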
  As part of the detailed perspective, data analysis can include drilling down into the outcome data to reveal a particular aspect of learning that is proving difficult for students. For example, if the data indicate that students can demonstrate effective oral communication in the clinical setting, but the evaluation rubric shows they are not performing well with patient education, this specific component of communication can become an area of focus to be emphasized and monitored.
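
As a sketch of that drill-down, the snippet below averages hypothetical rubric-item scores for a communication outcome and flags the weakest component. The rubric items, scores, and the 3.0 cutoff are assumptions made only for illustration.

  # Hypothetical rubric-item scores (scale 1-4) from several clinical evaluations.
  communication_rubric = {
      "uses appropriate terminology": [4, 4, 3, 4],
      "listens and responds to questions": [4, 3, 4, 4],
      "provides patient education": [2, 3, 2, 2],
  }

  for item, scores in communication_rubric.items():
      mean_score = sum(scores) / len(scores)
      flag = "  <-- focus area" if mean_score < 3.0 else ""
      print(f"{item}: {mean_score:.2f}{flag}")
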

Summary
  The true spirit of assessment is striving to improve student learning continually. When an assessment plan is implemented, it is essential to analyze the resulting outcome data carefully and completely to determine the effectiveness of the educational process. This is called closing the loop in assessment. It cannot be emphasized enough, however, that the quality of the data is dependent on the quality of the SLOs, measurement tools, and data collection process. A well-developed assessment plan, coupled with a thorough analysis of the resulting outcome data, completes the assessment cycle, promotes positive program changes, and ultimately enhances student success.



  Tricia Leggett, DHEd, R.T.(R)(QM), is associate professor for Zane State College in Zanesville, Ohio. She also is first vice chair of the JRCERT board of directors, vice chairman of the Radiologic Technology Editorial Review Board, and vice chairman of the Research Grant Advisory Panel.

  Stephanie Eatmon, EdD, R.T.(R)(T), FASRT, is faculty and consultant for National University in Costa Mesa, California, and first vice chair of the JRCERT board of directors.

References
1. JRCERT standards for an accredited educational program. Joint Review Committee on Education in Radiologic Technology website. http://wordpress-1182231-4152295.cloudwaysapps.com/programs-faculty/jrcert-standards/. Published 2014. Accessed October 31, 2016.
2. Ten data mistakes to avoid. InsightScope website. http://insightscope.com/10-data-analysis-mistakes-to-avoid/. Accessed November 20, 2016.


