BEA Assessment Boot Camp

Show Me the Data: Moving from Data to Report in Academic Assessment

April 22, 2017

An assessment report starts with one very important piece of information: the data received from individuals and faculty committees. These pieces of information illustrate the ways students in your program are learning. The data, created through both direct and indirect measures using prompts, rubrics, score sheets, and surveys, move from faculty members to committees and into reports for your academic assessment office. These reports are eventually disseminated to a variety of stakeholders and accrediting bodies. This year’s Assessment Boot Camp is geared toward strategies for gathering data, writing reports that reflect the data, and illustrating your institution’s academic success story.

 

1:30-1:45 p.m.      Introductions and Welcome           Adam Kuban, CAA Chair, Ball State University

 

2:00-2:20 p.m.      Conforming Program Assessment Measures to ACEJMC Values & Competencies

 

Between 2008 and 2016, more programs seeking ACEJMC accreditation were found out of compliance on Assessment of Learning Outcomes (Standard 9) than on any other accreditation standard. This session will examine the relationship between program student learning outcomes and ACEJMC professional values and competencies, and will consider how direct and indirect measures can be used to assess learning outcomes, as accreditation guidelines require, in order to improve curriculum and instruction.

Don A. Grady, Ph.D., Associate Dean, Elon University

 

2:20-2:30 p.m.      Break

 

2:30-2:50 p.m.      Using Survey Data in Program Assessment: Promise, Pitfalls and Process

 

Surveys (senior, alumni, internship, etc.) can be effective tools for assessing program effectiveness. However, not all surveys are created equal. This session will discuss best practices, demonstrate how surveys fit into an overall assessment plan, and illustrate how the data can be used to “close the loop.”

Mary Spillman, M.S., Associate Professor, Ball State University

 

2:50-3:10 p.m.      Experience Assessing Students’ Work, Progress, Learning, and Development?

 

You probably have a great final project for your students. You might even have more than one. I’ll bet these projects are fantastic barometers for assessing your students’ progress, development, and learning. But how do you assess them after each lesson or each class session? I’ll provide some guidance on assessing students more often during their time with you, and I’ll get you thinking backwards for better assessment of student progress.

Shawn Montano, B.A., Instructor, Emily Griffith Technical College

 

3:10-3:20 p.m.      Break

 

3:20-3:40 p.m.      Assessment: It’s More Than Just the Data

 

Although assessment requires collecting data and reporting it, typically in table form, what matters more is what your report says about the data analysis and the actions you will take to improve student learning.

John Turner, Ph.D., Professor, Towson University

 

3:40-4:00 p.m.      Dr. Strangessment: Or How I Learned to Stop Worrying and Love the Pretest/Posttest

 

This presentation will look at the use of pretest/posttest data to assess student learning and attitudinal change in communication classes. It will present data from knowledge assessments in broadcasting and theatre classes, as well as a multi-semester examination of how student attitudes on various issues in the news industry and culture changed over the course of a semester in a press and society class.

Robert Spicer, Ph.D., Assistant Professor, Millersville University of Pennsylvania

 

4:00-4:10 p.m.      Break

 

4:10-4:30 p.m.      Breakout Group Q&A        Don Grady, Stacey Irwin, Shawn Montano, Rob Spicer, Mary Spillman, and John Turner

 

4:30-4:40 p.m.      Closing Remarks           Stacey Irwin