“SINGING FROM THE SAME SHEET OF MUSIC”
(Dr. John Fleckles, Address to Faculty at Convocation, Spring 2004)
The song I refer to is one that we should sing together, with knowledge and in harmony, about what HPU is as stated in our WASC reports – that is, about our processes, evidence, and progress – so that when our accreditation visitors arrive in March they are greeted by a faculty that is knowledgeable and informed about our preparations and our case for compliance. Of course, we may not all agree. It is understood that we will sing with different voices, but the basic harmony has got to be there. So in order to sing, you need to know the sheet of music – THE SCORE.
Rest easy – I am not going over all our WASC essays and evidence today.
The WASC Capacity review visit is scheduled for March 17 to 19. Our essays and evidence making the case that we meet the capacity requirements of the new WASC standards were sent in on December 22. The essays were written by Nancy Hedlund and John Kearns, based mainly on input from groups that formed part of the Task Force on WASC Readiness, which was formed in August 2002 and met as a whole group or in subgroups assigned specific standards or parts of standards. The WASC essays are available at the deans' offices, OAA, and Nursing, and the evidence is on Campus Pipeline. [Hold up the documents.]
The WASC team is led by Dr. Jon Strauss, President of Harvey Mudd College. Its members are Karen Yoshino, Director of Institutional Assessment at Occidental College; Jeffrey Bialek, VP and Chief Financial Officer at Golden Gate University; J. Frederick Volkwein, Director of the Center for the Study of Higher Education at Penn State University; and Diana Wu, Professor of Business Administration at St. Mary's College of California in Walnut Creek, California. Also visiting will be a scholar in education administration from Redlands University, Dr. Mary Boyce, who will study the new process of accreditation and its impact on universities undergoing review, and also, of course, various WASC officials such as our institutional liaison at WASC, Elizabeth Griego. The team's purpose while here is to ask questions, listen, and verify the report on capacity. Essentially the same team returns in March 2005 for the Educational Effectiveness Review. We will be making the schedule for the visit over the next few weeks, and a forum in which you are a member may be involved. So be flexible with scheduling on March 17, 18, and 19.
Today I will address a specific chorus – a limited aspect of accreditation, but one which is a requirement, one in which you already have much invested and much responsibility, and one in which we rely on you for demonstrations of compliance. That topic is ACADEMIC PROGRAM REVIEW.
There are other academic compliance issues in which you have been involved and on which we are in good standing: learning objectives (most now on the Web), syllabus review, the Faculty Hiring Plan (referred to today), general education reform, and strategic educational effectiveness planning. Please inform yourselves of the progress we have made in these areas.
That said, here are the questions I will address this morning:
- First, where are we with academic program review? Do we demonstrate capacity with regard to processes and outcomes? Have we responded adequately to the WASC recommendations on program review made in 1999?
- Second, what progress have we made in linking academic program review to educational effectiveness planning?
- Third, going beyond this year’s visit, how prepared are we for demonstrating educational effectiveness for the report due to be sent in December 2004 for the educational effectiveness visit in March 2005?
1. Where are we with program review and do we demonstrate capacity?
The short answer: we have made considerable progress, certainly enough to demonstrate compliance, but we have not achieved critical mass in completed reviews (including external review and application of recommendations). And we still have programs in which the faculty are not yet adequately engaged in review.
Here is some data on that:
- 89% of the student body (5,254 students) are in programs that have started reviews and made some progress.
- We have a good record of starting up new program reviews, at the rate of 5 to 7 per year.
- On a positive note, program chairs have been named for all but the smallest programs.
- Also, we have developed our institutional capacity for assessment beyond program review. We will demonstrate that we have information about assessment activities carried out across the university that support academic program review, such as
- graduation surveys and
- more thorough academic data reports (since 1991), which are open and available on Campus Pipeline. These reports, produced once a term, provide data on
- enrollment by major,
- percentage use of adjunct faculty and faculty overloads by college and field,
- grade distribution reports,
- percentage of classes above and below the maximum size.
- These are available now as trends since 1991, are disaggregated by field and college, are available to all faculty and are available to and used by academic program review committees.
- Another new assessment source is the revised course evaluation form, which places a greater emphasis on student learning.
- In addition, for the past three years, workshops have been held with program review chairs, program chairs, and recently deans. This group, now called the Program Reviews Council, gives the process institutional recognition, strengthens the link of program review to institutional strategic planning, and provides a forum for discussing the challenges and problems of review.
But we still have challenges:
- Only 45% of the student body (2,673 students) are in programs that have completed reviews.
- And some programs are making no progress with academic program review at all. This is a serious problem: their inactivity can undermine the accreditation review.
- We are seriously behind on external reviews and need to accelerate them in 2004, beginning with marine science in late January with Dr. Joel Thompson of Eckerd College. We are also planning external reviews in computer science, communication, and accounting in 2004, as well as others.
2. What progress have we made in linking academic program review with educational effectiveness planning?
Our answer in the essays is that we have made significant progress but the entire process is not in place yet.
Academic program review has been going on at HPU since the mid-1990s and was started before educational effectiveness planning was in place. Now, academic program review is being integrated into educational effectiveness planning.
How is that being done?
First, program reviews provide the evidence and statements of needs at the program level that serve as input into planning: specifically, evidence about needs for faculty, curriculum reform, and reforms arising from conclusions about student learning, as well as other factors such as staying current with developments in the field and in appropriate learning technology.
Second, the planning process provides the AVENUE for implementation of program review recommendations, especially when program review recommendations are linked to one or more of the educational effectiveness strategic priorities.
But we still have some challenges.
For one, how do we decide among competing valid needs and program review requests when we have limited resources and do not yet have an institution-wide determination process?
Our immediate answer, a new step taken this year, is the establishment of the Academic Support Council (ASC):
- The ASC serves as a forum for coordinating the requests coming forward from colleges and programs with the educational effectiveness planning strategic priorities and with the university’s resources.
- In this way we align resources, planning, and the academic program review recommendations.
- Another area in which we are making progress in planning is the Faculty Hiring Plan. One has been established by each college, and now we are developing the overall university faculty hiring plan. We are starting to demonstrate a capacity to organize priorities for faculty hires among competing needs.
- A related new development is the Educational Effectiveness Planning Committee's effort to determine the characteristics of flagship programs.
- Both of these items, along with the Educational Effectiveness strategic priorities, will assist the Academic Support Council in preparing its recommendations to the President on expenditures.
- The point here: academic program review, based in the colleges and programs, is a basic building block for work on the strategic priorities, the faculty hiring plan, the definition of learning resource needs, and the definition of flagship programs – all of which are invoked when making the final decision to spend.
3. How prepared are we to demonstrate educational effectiveness for the review next year?
The first part – or first verse – you need to understand is this: the processes of review and planning that we already practice or are developing are NOT what next year's review is about. Those processes will be evaluated in the review this year. What will be reviewed next year are the results of our processes of review and planning: that is, have we completed assessments of our educational effectiveness, do we know the results as evidence, and have we instituted change as a result?
Right now we all need to sing from the same sheet of music in answer to this question by knowing that our case for educational effectiveness will be based on three themes (which, thank goodness, are also among the planning strategic priorities, so we have a track record of working on them):
- Promoting student learning – for which academic program review is a central activity and faculty responsibility.
- Enhancing organizational effectiveness – by effectively linking faculty governance and administration.
- Developing global citizenship.
We have much to show regarding progress in educational effectiveness with regard to all three.
Today I want to close by emphasizing our responsibility:
1) to understand the basics about the organizational aspects of our case for compliance with regard to capacity on academic program review,
2) to congratulate those programs that have not only accomplished reviews but have been able to act on the results,
3) to urge those of you who lag to catch up, and
4) for all of us to see where and how your program review efforts not only lead to improvement in your program but are a basic part of the institution's planning processes.
This year we demonstrate that we have capacity – that is, we have the processes (which cut across fields) and the resources to support educational effectiveness. Next year, we demonstrate that we have results – in terms of accomplishment of the themes: that, as an institution, we know through evidence that we achieve educational effectiveness, or know what to do if there are challenges in that regard.