NCHEA provides an industry-leading knowledge center for assessment resources serving administrators, faculty, students, alumni, and industry. By giving members a complete set of Higher Education Evaluation and Assessment resources in a single source, it helps ensure that both direct and indirect assessment findings become mechanisms for continuously improving educational experiences.
Tuesday, May 24, 2011
"Model of the Moment"
AEFIS Response:
Competency-based models in higher education cater to students, particularly part-time students, who want to earn a degree and enter the workforce quickly. However, many criticize this model for limiting students’ learning experiences outside their fields and technical skillsets, going so far as to say that the competency-based model “is not a college education.”
How does a competency-based model compare to a strong assessment plan built on student learning outcomes? The difference lies in how data is applied. In a competency model, students are awarded credit for passing competency tests after preparing individually, without an instructor or a structured course. With student learning outcomes, performance data is collected alongside grades to understand student learning and achievement. In the competency model, the focus shifts away from effective student learning in an effort to save the customers, the students, time and money, and institutional development shifts from curriculum design to test design so that graduates hold suitable credentials for a degree. Are these methodologies separate, but equal?
Regular student interaction, whether in person or online, and structured instruction benefit students’ learning and create opportunities for collaborative educational experiences. Collecting assessment data over time benefits both our understanding of students’ development and institutional programmatic development as it relates to workforce professions. While competency-based models offer great convenience to part-time or returning students, they do not offer “18-year-old [freshmen] or 20-year-old community college student[s the opportunities] to really do well and get a degree.” And convenience is not the vehicle for a strong education.
Measuring competencies or student learning outcomes, in addition to grading traditional assignments, shows students what they have learned over the course of a curriculum and provides mechanisms for practicing necessary skillsets.
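To make that distinction concrete, here is a minimal, purely illustrative sketch in Python of recording outcome-level performance alongside a traditional grade. The field names and the 1-4 scoring scale are assumptions for the example, not AEFIS’s actual data model.

# Illustrative only: a hypothetical record that keeps outcome-level
# scores next to a traditional grade for the same assignment.
from dataclasses import dataclass, field

@dataclass
class AssignmentResult:
    student_id: str
    assignment: str
    grade: float                                          # traditional grade, e.g. 0-100
    outcome_scores: dict = field(default_factory=dict)    # outcome name -> rubric level (1-4)

result = AssignmentResult(
    student_id="S1024",
    assignment="Senior Design Report",
    grade=88.0,
    outcome_scores={
        "communicates effectively": 3,
        "applies engineering ethics": 4,
    },
)

# The grade answers "how did the student do on this assignment?";
# the outcome scores answer "what did the student demonstrably learn?"
print(result.grade, result.outcome_scores)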
Becky Joyce, AEFIS Team
Thursday, April 21, 2011
Assessing our Assessing
At our recent ABET Symposium workshop, we asked participants to self-assess their programs' assessment practices. As in any classroom setting, a few participants were quick to share their institutional efforts while several shied away from the questions. Still, the group came up with some interesting discussion topics and open-ended questions to take back to their faculty and administrators.
Much of the room admitted that their curriculum mappings and student assessments are planned and warehoused on paper. For most of them, this means boxes and boxes of hard copies and hundreds to thousands of person-hours to prepare for an ABET accreditation visit. Although the room held representatives from many different schools, several individuals described analysis packages they had developed in-house for assessment data, and there was no discussion of shared tools or practices. The conference itself is meant to be a forum for cross-pollination of ideas and best practices, yet the schools remain trapped in the silo effect. So we introduced AEFIS to push the conversation toward forward thinking.
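As a hypothetical illustration of the alternative to paper warehousing, a curriculum map kept as structured data can be queried for an accreditation report rather than searched by hand. The course numbers and outcome names below are invented for the example and do not come from any participant's program.

# Illustrative only: a made-up curriculum map as structured data,
# so "which courses address this outcome?" becomes a query.
curriculum_map = {
    "ENGR 101": ["problem solving"],
    "ENGR 250": ["problem solving", "teamwork"],
    "ENGR 490": ["teamwork", "engineering ethics", "communication"],
}

def courses_covering(outcome):
    """Return the courses that claim to address a given outcome."""
    return [course for course, outcomes in curriculum_map.items()
            if outcome in outcomes]

print(courses_covering("engineering ethics"))  # -> ['ENGR 490']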
Shifting the conversation from how data is collected and stored (the logistics of assessment) to the real meat and potatoes of it, namely what data we should be collecting and how we should use it to improve student learning, got the audience much more engaged!
The outcomes that ABET expects of students at the completion of their degree programs can be difficult to assess and report on, especially when little or unsustainable infrastructure is in place. Developing assessments is often a trial-and-error process that begs for collaboration. So we dove right into some assessment activity questions:
How does your institution assess ethical components to report on outcomes/objectives?
- Assessment measures:
  - Scenario-based test questions
  - Developed case studies
  - Field exercise interviews
- How is student performance rated / documented?
  - Against a rubric (see the sketch after these questions)
- How often is such a rubric reviewed / adapted?
  - Rarely
How does your institution report on Program Educational Objectives?
- Assessment measures:
  - Student certifications post-graduation
- Does this demonstrate success or student learning?
  - Open for discussion
- How can we increase our response rates for alumni surveys?
  - Open for discussion
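As a purely illustrative sketch, here is one way a rubric could be kept as a structured record that carries its own review date, so the question "how often is the rubric reviewed?" has a reportable answer rather than "rarely." The outcome, criteria, and performance levels below are invented for the example.

# Illustrative only: a hypothetical rubric record with a review date.
from dataclasses import dataclass
from datetime import date

@dataclass
class Rubric:
    outcome: str
    criteria: list        # what is being judged
    levels: list          # performance levels, worst to best
    last_reviewed: date

ethics_rubric = Rubric(
    outcome="applies engineering ethics",
    criteria=["identifies the ethical issue", "weighs stakeholder impact"],
    levels=["unsatisfactory", "developing", "proficient", "exemplary"],
    last_reviewed=date(2011, 1, 15),
)

years_since_review = (date.today() - ethics_rubric.last_reviewed).days / 365
print(f"Rubric last reviewed {years_since_review:.1f} years ago")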
These and more questions are being posed by institutions as they plan their assessment efforts. And these questions only started the process of assessing our assessing.
We encourage you to review the questions that workshop participants considered. Download a copy of the workshop materials at our website. Please feel free to share your ideas and comments or let us know what questions we should be asking!