HISTORY

In 1994 this campus submitted to the Commission its preliminary plan (Appendix A) for campus academic assessment.  This was subsequently accepted and approved (Appendix B). 

In that original plan, academic program assessment (in contrast to liberal education assessment) was largely assigned to individual departments/collegiate units with reporting lines to our Office of the Vice Chancellor for Academic Administration (VCAA).  This remains the operational process for the campus.

This preliminary plan also addressed our liberal education curriculum.  As it has evolved since 1994, significant responsibility for the design of liberal education assessment is vested in the campus’ Liberal Education Policy Subcommittee (LEPC), composed of faculty (drawn from all collegiate units), students, and administration.  Its actions are subject to review and ratification by the campus Educational Policy Committee, whose actions, in turn, are finally approved by the Campus Assembly.  Implementation of liberal education assessment is assigned to the VCAA, and it is through this office that data collection and analyses are performed.

During the last comprehensive evaluation (1997) of this campus, the team spent considerable time examining the degree and extent to which our 1994 assessment plan had been implemented.  It was clear to the consultant-evaluators, as it was to us, that while many significant steps had been taken, actual data collection was lacking and it was time to restructure what had previously been proposed.  Among other things, the team recommended that “… a progress report be submitted to NCA in September, 2000 documenting the implementation of the institutional assessment plan in all disciplines and indicating the curricular modifications that have been made.”

A perturbation to all of our academic programs, including our liberal education curriculum, was a system-wide conversion from quarters to semesters effective fall 1999.  Although valiant efforts were made during 1999-00 on assessment, it was apparent that no purpose would be served by looking at quarter programs that no longer existed and that it was still too early to study, with any degree of thoroughness, our semester programs.  Therefore this campus requested and received from the Commission a one-year extension in its submission of an assessment implementation report (Appendix C).

We are now pleased to submit this report.  We believe that this campus has embraced, to a great extent, the basic philosophy of academic assessment.  Furthermore, we believe that steps have now been put into place to create a campus environment that recognizes the importance of these endeavors and how they will ultimately benefit our students and academic programs.

The first principle guiding UMD’s overall assessment plans was twofold: (1) assessment must be based on the stated objectives (as articulated in the enabling proposals) of courses, individual academic programs, and the general liberal education program; and (2) discipline and/or program administrators needed to determine the outcomes they expected and then develop assessment plans that measured success in reaching those objectives.  In some cases, it was necessary for departments, program members, and faculty committees to define exactly what they expected graduates to be able to do; either this had not been done previously or there was no agreement on objectives.  Above all, faculty needed to define the outcomes.  If those outcomes did not match administrative understandings, discussions had to occur to arrive at common expectations.

A second principle was that multiple methods of assessment should be used to evaluate success and identify recommended changes.  It was strongly believed that no single method of assessment gives valid data for guiding program and course development. 

A third principle stressed that program assessment should guide revisions at both the course and program levels.  Assessment data should not be ignored; they should provide significant arguments for proposed revisions.

 

UMD ACADEMIC PROGRAM ASSESSMENT

BACKGROUND

As required for accreditation, the Duluth campus of the University of Minnesota initiated a formal program review and assessment process.  In establishing a systematic program assessment plan, the administration strongly believed that program evaluation would succeed only if faculty defined the processes for the planned evaluation.  Prior to the initiative, undergraduate program assessment was "spotty" and irregular.  Many programs worked with employers and graduates to determine the effectiveness and appropriateness of programs, but processes and feedback were reviewed inconsistently.  Other programs relied on accreditation guidelines and/or faculty-administrator initiatives to direct and assess program effectiveness.

In developing a campus-wide plan for academic program assessment, departments were first asked to confirm agreement on their expected outcomes.  This question generated discussion in some departments.  Some departments concluded that they were educating students solely as preparation for graduate school.  Professional programs were clearly educating for an employment market.  The remaining programs recognized that they were preparing students for both options, and thus developed assessment plans to measure effectiveness in both endeavors.

Departments crafted assessment plans that would evaluate student success according to program objectives.  Discussions focused on which assessment measures would be appropriate and would maximize the validity of the assessments with regard to the department objectives.  Departments were universally concerned about the time and expense involved in program assessment, and this concern affected some plans.  Departments were encouraged to use several types of assessment, especially if there were several expected outcomes.  These discussions resulted in the matrix (Appendix D) that appeared in our 1994 assessment plan.

 

IMPLEMENTATION OF THE SELF-ASSESSMENT PLAN

When this campus converted to semesters in fall 1999, assessments of previously existing quarter programs were limited in their significance, but they helped some departments make program adjustments related to the semester conversion.  During the past academic year (2000-01), the VCAA office contacted all departments requesting answers to specific questions about whether assessments were being done and to what extent the resulting information was being used in program development and modification (Appendix E).

The survey of departmental implementation of assessment plans has shown, among other things, that (1) more consistent follow-up on assessment is needed, and (2) departments should perhaps be held more accountable for assessment results when proposing changes in program content and/or structure.  The feedback indicated that nearly all departments and programs have implemented their assessment plans; however, without administrative follow-up, and given the frequent turnover at the department head level, assessment will be inconsistent unless incentives are provided.  Incentives can include reports of results to the dean and VCAA, use of assessment results in curriculum change proposals, and consideration of assessment information in budgetary discussions.

As noted earlier, departments/programs were required to identify the assessment methods that were most appropriate for their program objectives (Appendix D).  In spite of the disruption caused by semester conversion, some assessment processes were implemented as part of the conversion itself; other assessment activities were delayed until 2000-01, or until students had enough semester experience to give credible information through the assessment processes.

The memo previously mentioned (Appendix E) asked several questions in order to: (1) verify that program assessment was actually being done by departments/programs; (2) verify that appropriate consideration was being given to assessment data; and (3) determine what needed to be done to continue, and to promote, valid program assessment in the future. 

A VCAA review of the program responses to the memo prompted follow-up requests for information.  While departments generally responded positively to the questions that were asked, there was significant variation in the amount of information provided. The department and/or program responses will be used to determine how the campus needs to manage and promote effective program assessment in the future.

 

EXTERNAL TEAM PROGRAM REVIEWS 

In 1998 an administrative decision was made to implement a systematic review of all undergraduate programs by means of external review teams.  The policy leading to that decision (modified in October 2000) is found in Appendix F.  It should be noted, however, that undergraduate programs in departments that also have graduate programs had already been undergoing reviews on an approximately seven-year cycle.

Consultants from comparable institutions are invited to the UMD campus to evaluate the effectiveness and efficiency of program offerings.  While such on-site evaluations can be costly, it is believed that program faculty and campus administration can quickly get global information about program direction and effectiveness through this mechanism.  On-site evaluations encourage departmental program assessment and produce information that is useful in making time-sensitive decisions on budget and program direction.  By the end of the 2001-2002 academic year, all departments will have undergone a recent program review.  The schedule showing both undergraduate and graduate program reviews is shown in Appendix G.  External reviews are considered important and successful enough that from this point on all departments and their programs will undergo on-site reviews every seven years.  In some cases this will be part of a program’s professional accreditation visit.

All departments that previously had not been included when analogous graduate programs were reviewed were scheduled for on-site evaluations over a three-year period.  The protocol for the external review of undergraduate programs was modeled after the protocol used by the University of Minnesota Graduate School.  The VCAA manages the external review of programs with the cooperation of the UMD Associate Graduate School Dean when a department also has a graduate program.  The focus of the reviews is “program improvement” by identifying what is being done well and suggesting ways of improving program effectiveness. 

Prior to a review, the department prepares a self-study document according to guidelines provided to it (Appendix H).  This self-study report, together with an enumeration of topics upon which the reviewers should focus during their visit (Appendix I), is sent to the external reviewers in advance of their visit.

External review teams are nominated by the program faculty and the college dean, with final selection by the office of the Vice Chancellor for Academic Administration (VCAA).  When a graduate program is also to be reviewed, the UMD Associate Graduate School Dean is consulted in the selection of the team.  For an undergraduate review, the external team consists of three members.  If both undergraduate and graduate programs are to be considered, a team of at least four individuals is selected.  The department contacts candidates for the site team to confirm their willingness and ability to participate in the review; if willing and able, the candidates forward current vitae for administrative review.  The review team is usually composed of members who are recognized for their leadership in the discipline, who come from other regions of the country, and who originate from, or understand, schools of our size and mission.

The agenda for the site visits is structured to include opportunities for the team to meet and visit with representatives of the various groups involved with or affected by the program.  A typical agenda is shown in Appendix J.  The visit usually starts with a meeting of the team, the college dean, and the VCAA.  When a graduate program is involved, the UMD Associate Graduate School Dean is also included.  The objective of this meeting is to clarify the intent and purposes of the review.  It is emphasized with the team that the intent of the site visit is program improvement; the review is not for “accreditation” or to boost the image of the program, nor is it for the purpose of highlighting a body of negative or derogatory information.  Most site teams have recognized these distinctions and have honored our intent.

At the end of the visit, the site team meets with the Chancellor of UMD, the VCAA, and the dean of the college in which the program resides.  Again, if a graduate program is also being reviewed, the Associate Graduate School Dean is present.  The team reviews its general findings and recommendations at this meeting and submits a written report within four weeks of the visit.  Once the report is received, the department has an opportunity to review it and offer comments, corrections, and/or concerns.  Campus administration reviews both documents, and the VCAA responds with a statement of action.


ACTION STEPS

Experience with the various program assessment activities on campus is prompting, and has already caused, changes in methods of assessment, implementation of assessment protocols, and analysis of assessment results.  Each method of assessment in our original plan has had weaknesses.  Some of our planned changes are listed below.

Program Self-Assessment

1.    Several programs have found that assessment methods they had planned to use are too cumbersome to implement and/or do not yield results/data valuable enough to merit the costs in time or resources.  As a result, they anticipate using other methods of assessment subject to approval by their dean and the VCAA.

2.    As noted previously, we need to establish more consistent administrative follow-up on the implementation of program assessment plans to be sure that plans are executed and that data are used for program improvement.

3.    We should try to increase the accountability of programs in the use of assessment data when proposing course and curriculum changes.

4.    It will be helpful to review assessment plans of all groups to determine other methods that might be helpful and to be certain that program assessment plans are still appropriate for current program objectives.

External Program Reviews

The most important change to make in these reviews is to define ways for more consistent follow-up on team reports.  It is important to give programs the opportunity to respond to team reports; we now need to systematize what happens after departments respond.

Costs of these reviews have been "reasonable", but we should look for ways to economize without jeopardizing the quality of the reviews.  Cost was a consideration in past reviews, but it did not limit the quality of the review process.


UMD LIBERAL EDUCATION ASSESSMENT

BACKGROUND

One of the cornerstones of the University of Minnesota Duluth undergraduate experience is the exposure to liberal education that students receive (see Appendix K).  In addition to providing breadth of knowledge, this program encourages critical and creative thinking; develops speaking and writing skills; provides practice in analytical study methods; examines basic values; encourages active citizenship and social responsibility; and provides awareness of historical traditions, intellectual and artistic endeavors, global issues and concerns in today's world, and diverse cultural values in the United States.

 

COURSE ASSESSMENT

All courses in the UMD Liberal Education program fall into one of ten categories.  In the fall of 2001, members of the UMD Liberal Education Policy Subcommittee (LEPC) will begin a careful and systematic review of each course within the liberal education program to ensure that the courses comply with the goals and objectives of their respective categories.  Courses are slated for review on a rotating basis, with each course coming up for review once every five years.

Assessment of individual courses will proceed on the following rotating schedule:

                 2001/2002 academic year

Category 1  Comp 1120—College Writing or its equivalent

Category 2  Math, Logic, and Critical Thinking

PE/Rec courses (although these courses are not category-specific, they are still part of the Lib Ed program and need to be assessed in the five-year cycle)

 

                 2002/2003 academic year

Category 3  Communication, Computer Science, and Foreign Languages 

Category 4  Physical and Biological Sciences with Lab

 

                 2003/2004 academic year

Category 5 Physical and Biological Sciences without Lab

Category 6  The Social Sciences

 

                 2004/2005 academic year

Category 7  Historical and Philosophical Foundations

Category 8  Contemporary Social Issues and Analysis

        

                 2005/2006 academic year

Category 9 Literary and Artistic Expression: Analysis and Criticism

Category 10 Literary and Artistic Expression: Performance

 

At this point, reviews of categories 1 and 2 and of the PE/Rec courses will again come to the top of the rotation.  In 2006/2007, the cycle of evaluation will begin again.

Because of the magnitude of this course-by-course review, the LEPC will be broken into small groups, with each group assigned responsibility for reviewing a portion of each year’s courses.  The groups will be composed of faculty intimately involved with the area under review, as well as faculty from other areas of study.

Departments offering courses in the respective categories will be asked to provide representative examples of syllabi, exams, and guidelines for any required oral presentations and/or written assignments from the previous four semesters.  Using the materials provided, as well as the original course proposal, a small group from the LEPC will review each course.  Courses will be designated as consistent or inconsistent with the Liberal Education program’s overall and category-specific goals (Appendix L).  If a course is found inconsistent, the department will be notified and the course will be reviewed again the following year.  If, at that time, it is still inconsistent with the goals, a revised course proposal will be requested.  If a course is ultimately deemed inconsistent with the goals, the LEPC will recommend its removal from the Liberal Education program.

 

LIBERAL EDUCATION ASSESSMENT VIA FOCUS GROUPS

Whereas assessing the liberal education program on a course-by-course basis will provide us with an indication of the knowledge instructors are conveying to their students, assessment using student focus groups has also been implemented to gather data otherwise unobtainable.  The narrow, focused picture provided by the course-by-course assessment is complemented by the depth of the student focus groups (composed of UMD juniors and seniors who have completed most or all of their liberal education credits at UMD), which explore how the liberal education curriculum, taken as a whole, is doing its job.

In spring 2001, focus group meetings were conducted on a collegiate basis so that any differences of opinion among our five colleges could be identified.  To create as neutral a setting as possible, facilitators for these focus groups were recruited from our Counseling Psychology graduate students.  Approximately ten students were invited to participate in each focus group, for a total of 50 different students.

A set of standard questions (Appendix M) was developed by LEPC and reviewed with the facilitators prior to the group meetings.  These questions dealt with the overall structure and objectives of the liberal education program.

Unfortunately, our experience last spring with these focus groups was not satisfactory.  Although students expressed an interest in participating, very few of them actually appeared at the meetings.  Moreover, those students who did attend were reluctant to make constructive comments, and their interest in the entire process was minimal.  Consequently, this fall (2001) we will try a different approach.

All our students are required to take an advanced composition course, typically during the junior or senior year.  There are several such courses, each geared to a specific college.  We have identified specific sections in each of the following five courses:

Comp 3110 (Advanced Writing; Arts and Letters)

Comp 3121 (Advanced Writing; Business and Organizations)

Comp 3140 (Advanced Writing; Human Services)

Comp 3150 (Advanced Writing; Science)

Comp 3160 (Advanced Writing; Social Sciences)

Within the specific section for each course, one class period will be devoted to a consideration of the liberal education questions that we have developed (Appendix M).  The instructor will facilitate this discussion.  Students will subsequently be asked to write a summarizing paper on the discussion.  Not only will this provide us with a written record – actually several, as each section will contain approximately 20 students – but we also hope the students will gain experience in translating spoken conversation into written form.

We intend to use this information, presumably representative of a large cross-section of our students, in assessing the overall design and structure of our liberal education program.

 

ASSESSMENT VIA STUDENT SURVEYS

The final method by which UMD's liberal education program is being assessed consists of surveys distributed to a cross-section of seniors in their last semester.  A quantitative survey instrument assessing the entire liberal education curriculum was designed by LEPC and, in spring 2001, given to a random sample of 500 final-semester seniors assembled by our Office of Institutional Research.

To increase the response rate, the survey was distributed in two formats: traditional and electronic.  Participants received a letter and paper survey instrument via campus or US mail.  The cover letter explained the purpose of the survey and requested the participant's help.  Participants were also informed that the enclosed survey existed in an electronic format if they would rather participate that way.  Participants were given the URL to the secure site, as well as a password that allowed them to enter the site and fill out the survey.  Through this approach we attempted to maximize the return rate.

Copies of this letter and the survey, together with the results, are shown in Appendix N.  Our overall return rate was about 15%.  We did not see any significant difference in the response percentages between the written form and the electronic form.  The number of responses to each question (shown as “N” in Appendix N) and the mean response to each question (shown as “Mean” in Appendix N) reflect the total of written plus electronic returns.

We will use this survey again this academic year, most likely next spring.  With those results, plus those we have already obtained, changes to our liberal education program will be explored.


 

SUMMARY

As a campus we feel quite pleased with the progress that we have made regarding the assessment of student learning.  At the time of the last comprehensive visit (1997) there was no systematic review of all our academic programs; only those departments having graduate programs had undergone prior reviews.  That has now completely changed.  As previously indicated, by the end of 2001-02 all of our academic programs will have been reviewed by external reviewers.  Additionally, we now have in place a policy and schedule whereby these reviews will be repeated on a periodic basis.

In 1997 assessment of our liberal education program existed only on paper; implementation had been non-existent.  Since that time we have revised our strategy and plan for such assessment.  Except for the course-by-course review, which will commence in fall 2001, the other components of this assessment were implemented during 2000-01.  Besides yielding important assessment information from which future actions can be taken, these efforts have also taught us how improvements can be made in the procedures themselves.

Finally, and of equal or greater importance, is the realization by our faculty that assessment of academic programs and curricula is more than merely looking at the grades students receive.  We feel that this is largely attributable to our involvement of faculty in the planning, implementation, and analysis of the processes we have employed.  Our faculty governance process participated in the approval and endorsement of all that is described in this report.  Without this support, we could not have achieved what we have.

Overall, we feel that the lessons we have learned about designing and implementing measures for the assessment of student learning would have applicability at other institutions.  Toward that end, we will be proposing a paper describing our experiences for presentation at the 2002 annual meeting of the Commission.