Monday, April 25, 2011

"Why Are We Assessing?"

By: Linda Suskie
AEFIS Response:
When approaching the tasks associated with outcomes-based assessment in higher education, we tend to skip the why and jump right into the how. Worse, we often begin facilitating the how without defined goals that can be communicated to the stakeholders involved. Linda Suskie, Vice President of the Middle States Commission on Higher Education, captures these sentiments in her piece "Why Are We Assessing?" She urges those involved in assessing learning in higher education to recognize clear, succinct goals for assessment: "Assessment is simply a vital tool to help us make sure we fulfill the crucial promises we make to our students and society."

The promises that Suskie remarks on are being questioned by government policymakers, investors, accrediting agencies, and the consumers of higher education: students and their parents. So, how do we (a) define expectations and (b) demonstrate to society that we are delivering the expected outcomes for any given student?

AEFIS uses the Course Section Dashboard as the platform for faculty to develop effective contracts with students and other stakeholders. This document presents the what and the why expected of students, in the form of course outcomes and professional contributions. These contractual documents are more commonly known as course syllabi. They aim to answer more than what students are getting for their money; they also describe how students will be able to understand concepts and apply skills as they enter the workforce.

Telling students and stakeholders about the expected outcomes is one thing, but following through is another. This is where direct and indirect assessment come into play. The AEFIS Solution hosts mechanisms for entering and archiving student assessment data, with the end result being student outcomes transcripts: students walk away from degree programs with more than diplomas; they leave with documented evidence of their performance on outcomes.
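
To make the idea of an outcomes transcript concrete, below is a minimal sketch, in Python, of what such an archived record might look like. The structure, names, and fields are our own illustrative assumptions, not the AEFIS data model itself.

from dataclasses import dataclass, field

# Illustrative sketch only: class and field names are assumptions on our part,
# not the actual AEFIS schema.

@dataclass
class OutcomeEvidence:
    outcome: str          # e.g., "Applies statistical methods to interpret data"
    course_section: str   # where the assessment took place
    measure: str          # "direct" (rubric-scored work) or "indirect" (survey)
    result: float         # normalized performance, 0.0 through 1.0

@dataclass
class OutcomesTranscript:
    student_id: str
    evidence: list[OutcomeEvidence] = field(default_factory=list)

    def summary(self) -> dict[str, float]:
        """Average archived performance per outcome."""
        by_outcome: dict[str, list[float]] = {}
        for item in self.evidence:
            by_outcome.setdefault(item.outcome, []).append(item.result)
        return {o: sum(r) / len(r) for o, r in by_outcome.items()}

A record like this, accumulated course section by course section, is what would let a graduate show documented evidence of performance on each outcome rather than a diploma alone.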

This data is most valuable to students, but it is also invaluable to employers, government policymakers, accrediting agencies, and the like. With this information we can start to explain the means for achieving effective personalized learning. Suskie challenges that "…we haven't figured out a way to tell the story of our effectiveness in 25 words or less, which is what busy people want and need… Today people want and need to know not about our star math students but how successful we are with our run of the mill students who struggle with math." We accept the challenge!

Effective learning starts with an understanding of expectations and progresses with continuous communication, evaluation, and revision of those expectations.

Interactive archival systems such as AEFIS serve as vehicles for effective instruction by connecting assessment to teaching and learning.

Becky Joyce, AEFIS Team

Thursday, April 21, 2011

Assessing our Assessing

At our recent ABET Symposium workshop, we asked participants to self-assess their programs' assessment practices. As in any classroom setting, a few participants were quick to share their institutional efforts, while several shied away from the questions. The group managed to come up with some interesting topics of discussion and open-ended questions for their faculty and administrators.

Much of the room admitted that their curriculum mappings and student assessments are planned and warehoused on paper. For most of them, this results in boxes and boxes of hard copies and hundreds to thousands of man-hours spent preparing for an ABET accreditation visit. Although the room had representatives from many different schools, and several individuals described analysis packages they had developed for assessment data, there was no conversation about shared tools or practices. The conference itself is meant to be a forum for cross-pollination of ideas and best practices; the schools, however, remain trapped in the silo effect. And so we introduced AEFIS to move the conversation forward.

Shifting the conversation from how data is collected and stored (the logistics of assessment) to the real meat and potatoes of it, namely what data we should be collecting and how we should use it to improve student learning, got the audience far more engaged!

The outcomes that ABET expects students to demonstrate at the completion of degree programs can be difficult to assess and report on, especially when there is little or unsustainable infrastructure in place. Developing assessments can be a trial-and-error process, and it begs for collaboration. So we dove right into some assessment activity questions:

How does your institution assess ethical components to report on outcomes and objectives?
  • Assessment measures:
    • Scenario-based test questions
    • Developed case studies
    • Field exercise interviews
  • How is student performance rated and documented?
    • Against a rubric (see the sketch after this list)
  • How often is such a rubric reviewed and adapted?
    • Rarely
How does your institution report on Program Educational Objectives?
  • Assessment measures:
    • Student certifications post-graduation
  • Does this demonstrate success or student learning?
    • Open for discussion
  • How can we increase our response rates for alumni surveys?
    • Open for discussion
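
Because "against a rubric" can mean very different things from program to program, here is a minimal sketch, in Python, of one way a rubric rating could be captured and scored. The criteria, performance levels, and point values are hypothetical placeholders, not a recommended rubric or an AEFIS feature.

from dataclasses import dataclass

# Hypothetical example only; criteria and point values are placeholders.

@dataclass
class Criterion:
    name: str                 # what the rater looks for
    levels: dict[str, int]    # performance level -> points

@dataclass
class Rubric:
    outcome: str              # the outcome this rubric documents
    criteria: list[Criterion]

    def score(self, ratings: dict[str, str]) -> int:
        """Total points for one student's ratings (criterion name -> level)."""
        return sum(c.levels[ratings[c.name]] for c in self.criteria)

ethics_rubric = Rubric(
    outcome="Understanding of professional and ethical responsibility",
    criteria=[
        Criterion("Identifies the ethical conflict", {"absent": 0, "partial": 1, "clear": 2}),
        Criterion("Justifies the chosen action", {"absent": 0, "partial": 1, "clear": 2}),
    ],
)

print(ethics_rubric.score({"Identifies the ethical conflict": "clear",
                           "Justifies the chosen action": "partial"}))  # 3 of 4 points

Capturing ratings in a structured form like this, rather than on paper, is what makes a question such as "how often is the rubric reviewed and adapted?" answerable from the archived data.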

These questions, and more, are being posed by institutions as they plan their assessment efforts, and they only begin the process of assessing our assessing.

We encourage you to review the questions that workshop participants considered. Download a copy of the workshop materials at our website. Please feel free to share your ideas and comments or let us know what questions we should be asking!

Monday, April 11, 2011

2011 ABET Symposium

AEFIS is looking forward to the 2011 ABET Symposium, April 14-16 in Indianapolis, IN. Academic partners from Drexel University, along with Mustafa Sualp, President of AEFIS, and Becky Joyce, Operations Coordinator for AEFIS, will be presenting a workshop, "Connecting Assessment to Teaching and Learning to Sustain Accreditation."

80-Minute Mini Workshop
Saturday, April 16, 2011
10:35AM-11:55AM


ABET, Inc. is a recognized accreditor of college and university programs in applied science, computing, engineering, and technology. It has accredited over 3,100 programs at more than 600 colleges and universities.

Come visit us at our workshop, where we will be available to answer any questions about AEFIS and share more about our AEFIS Partner Program. Please take a few minutes to learn more about AEFIS at www.goAEFIS.com, or contact the AEFIS Team directly at info@goAEFIS.com.

Wednesday, April 6, 2011

What is AEFIS?

The AEFIS Team is dedicated to delivering up-to-date information about the foremost assessment and accreditation news and practices to all stakeholders: administrators, students, alumni, faculty, and industry members. The AEFIS Team expects no less than the best, and neither should you. That's why we have created the Academic Evaluation, Feedback and Intervention System (AEFIS), the web-based academic assessment and accreditation management solution. AEFIS streamlines the accreditation process by documenting plans for continuous curriculum improvement and generating relevant, easy-to-read reports.

AEFIS was designed by a university for universities. In 2003, Drexel University's School of Biomedical Engineering, Science and Health Systems sought a solution for attaining ABET accreditation. That solution was AEFIS. In 2008, after successfully attaining accreditation, the School of Biomedical Engineering, Science and Health Systems received a National Science Foundation (NSF) grant for research efforts that included the continued development of AEFIS's student-centered features. In 2009, Untra Corporation purchased the majority license for AEFIS to facilitate its commercialization.