Welcome to the Institutional Student Retention Assessment (ISRA) System. We hope this assessment will help your institution deal with the complexities of student retention in your department, on your campus, or across the system. The ISRA is a tool designed to help coordinate retention planning as it pertains to student-centered initiatives. The information that follows should answer your questions about the ISRA and how it can help your institution.
According to Swail (2003), there are four phases of retention programming at the postsecondary level. Phase I is the pre-planning stage, involving the collection of information about the campus. Phase II, planning, is the preparation of a plan to administer on campus. Phase III is the implementation of that plan, and Phase IV involves monitoring the impact of changes on campus. The ISRA is an important tool for Phase I, the pre-planning phase. At this stage, an institution must audit itself to determine what its specific challenges are regarding student retention, persistence, and success. At the same time, the institution must identify the practices and strategies currently in use and determine their level of effectiveness. And finally, the institution must identify the structures and resources available to make change happen.
With these issues in mind, we developed the ISRA to help guide institutions through this complex process. We firmly believe that most institutions do most of the right things to engage students and create a successful climate on campus (even if that campus is virtual). Where institutions fall short is in the following areas: first, they don’t always know what they are doing for students. Because the campus is a large organism, we are not always cognizant of the many strategies in use at any particular time, and this is problematic when trying to improve the institution. Second, we often don’t know how successful these strategies are in practice. They all make sense in theory, but do they work for your students? Third, do these strategies reach the students who need the support? Just because the institution “does it” certainly does not mean that students with the greatest need “get it.”
To clarify, institutions must identify the challenges, document the current solutions, measure their effectiveness, and then devise plans for improving the entire system. There are further considerations for the second phase, planning, where institutions must determine the resources available to make change, but our focus in the ISRA is Phase I, pre-planning.
We hope you find the ISRA a useful tool for your institution. The Educational Policy Institute developed the ISRA over three years with support from Lumina Foundation for Education. We are dedicated to improving the ISRA over time, so that it does not remain a static program but rather an evolving instrument with evolving features for institutions. Just as institutions of higher education change, so must our strategies and tools if they are to remain current, responsive, and meaningful.
Please feel free to contact us at info@educationalpolicy.org or (757) 271-6380 if you have any questions or comments.
Developed by the Educational Policy Institute with the financial support of Lumina Foundation for Education, the Institutional Student Retention Assessment (ISRA) is a web-based self-assessment for institutions of higher education. The ISRA is intended to help the institution assess its status with regard to serving students and, ultimately, keeping undergraduate students on course to degree and curbing the dropout dilemma encountered by many of our postsecondary institutions. The ISRA queries campus-based stakeholders about their current use of resources, retention strategies and programs, institution-wide characteristics, and policies and practices in the following areas:
- Institutional Context
- Recruitment & Admissions
- Financial Aid
- Student Services
- Academic Services
- Curriculum, Teaching & Learning
By entering this information into a web-based system, stakeholders participate in a process that produces a report illustrating their institution through the lens of student retention. An institution committed to student success can facilitate its mission by employing the ISRA as part of a wider effort to identify its particular strengths and weaknesses for purposes of continuous improvement. For this reason, the ISRA contains numerous items that ask an institution to rate its own performance or inventory its own policies and practices as honestly as it can.
The ISRA was designed with the US system of higher education as its primary target (e.g., two-year, four-year, and proprietary institutions), but with a keen eye toward making the system usable by institutions in Canada and beyond. Some of the language and nomenclature will differ, but institutions outside the US should still be able to use the tool effectively.
Institutions that are truly interested in improving their services to students and, as a result, increasing student retention and persistence, can use the ISRA to guide them through the process of researching, planning, and institutionalizing retention-focused programming and strategies on campus. The ISRA is not a silver-bullet, sure-thing tool to solve the student retention puzzle on campus. Rather, it is a tool that can help guide the institution through the steps necessary to make change on campus. The ISRA is less about content and more about process. If institutional teams are mindful of that statement, they will be much better prepared to get something meaningful out of the experience. While content is certainly important in acknowledging the status of the institution, it is the team building, staging of questions, and acknowledgement of where the institution is and determination of what it wants to be that matters in the end. Thus, a simple forewarning for institutions that can’t get beyond the wording or semantics of the ISRA: don’t start. You’ll simply waste your time and that of participating faculty and staff.
We acknowledge that the ISRA is not a perfect inventory. There are surely many things left out or some things that are phrased in a way that does not make perfect sense to a particular institution, department, or individual. The ISRA can’t be all things to all people or institutions. But the ISRA, when taken as intended, can be useful to institutions as a foundation for their internal processes.
In the end, it is our hope that the ISRA will help institutions by forcing them to think, in teams, about what works and what doesn’t at their institution. These pieces of information will then form the platform for moving forward toward the planning phase of the strategic process.
ISRA is composed of six discrete sections: Institutional Context; Recruitment and Admissions; Financial Aid; Student Services; Academic Services; and Curriculum, Teaching, and Learning. With the exception of the Institutional Context section, each section opens with a Strategic Framework section, which poses questions on the mission statement, goals and objectives, policy and practice, and evaluation and assessment for this area. The Strategic Framework section is followed by sections on specific program areas within the larger unit (e.g., Counseling Services within the Student Services section). The program sections include questions on perceived strengths and weaknesses, as well as on specific strategies and practices within that program area.
The concept for ISRA is based largely on research conducted by Swail, Redd, and Perna (2003). Swail’s geometric model of student persistence and achievement acknowledges that student success depends on the interaction of the student and the institution, and more specifically, how the institution understands and reacts to the cognitive and social attributes of the individual.
Figure 1. Swail’s Geometric Model of Student Persistence and Achievement (Swail, Redd, and Perna, 2003)
The framework was originally developed to better understand what it takes for students of color in STEM areas (science, technology, engineering, and mathematics) to succeed in higher education. Over the course of a decade, it became clear that the central tenets of the model could improve success for all students, since students are diverse in many ways, not just by the color of their skin. The framework has thus evolved into its current design.
Using these theoretical pieces as a foundation, ISRA was developed to identify key factors on campus that support or detract from student retention. ISRA, in effect, seeks information about institutional support services currently in operation and issues affecting students who seem to fall through the cracks of the system. ISRA builds a model of goals, objectives, and related strategies for institutions and helps determine where the strengths and weaknesses are in the institution’s approach.
Any institution that thinks it can benefit from the ISRA can register and use the instrument for free until January 2008. At that time, a licensing fee will be introduced to help sustain the evolution of the ISRA.
There are two general types of reports that can be generated by the ISRA. The first is a general report for each of the six sections. This report produces a PDF of all information entered into the ISRA.
A second, summary report can also be produced for each section, summarizing the main components and using an analytical procedure to determine areas that an institution may wish to focus on for institutional improvement.
Founded in 2002, the Educational Policy Institute (EPI) is an international, non-profit association of researchers and policy analysts focused on studying academic preparation for, access to, and success through postsecondary education. Because of the increasingly competitive international economic environment, EPI is also committed to research that measures and improves the quality of education students receive.
As part of its mission, EPI operates studentretention.org, a research-based center designed to study issues and disseminate information to college administrators, faculty, and other stakeholders regarding student retention and persistence. In addition to ISRA, studentretention.org includes a number of useful services to the postsecondary community, including the development of a peer-reviewed, web-based “EFFECTIVE PRACTICES” database, a regular newsletter (Student Success) that updates subscribers on retention issues and resources, an annual survey of campus professionals, the Annual Student Retention Awards program, regional and national workshops and conferences on student retention, research projects, and research-based services. Taken together, these programs and services provide administrators and practitioners with useful, hands-on information to help them improve student retention and persistence on their campuses.
For more information about EPI, please visit www.educationalpolicy.org or www.studentretention.org.
Student success is everybody’s business and is ultimately about change management on campus. Our knowledge of student retention and practices on campus underscores our belief that it takes a campus to make positive change for students. Thus, it is critical to involve representatives from across the campus in planning and implementing student success programming.
An institution can determine its best strategy for completing the ISRA. However, we believe that the ISRA is best conducted through a steering committee formed by the institution’s senior leader, such as the president or his/her designate. This cross-institutional team may include approximately 6 to 10 individuals representing various offices and academic divisions to complete the assessment on a collaborative basis. While team members will vary from institution to institution, participants may include:
- IPEDS coordinator/director of institutional research
- Directors of academic affairs, academic services, admissions, financial aid, and student services
- Representatives from campus constituent groups, including students, staff, and faculty
The ISRA is designed as a comprehensive planning tool. Once teams review the ISRA, they will clearly understand that answering its questions effectively requires collecting information from, and discussing topics with, various campus departments and groups.
We estimate that, if done to completion, the ISRA will take between one and six months. It is unlikely to take less time, and it could take more depending on the depth the institution wishes to pursue. Remember, this is a “process” as much as anything else.
Some institutions may not want to complete the entire tool, focusing instead on key sections (e.g., financial aid) or only on key faculties or colleges (e.g., Engineering). That obviously reduces the time to completion, but institutions must decide what they want to accomplish and use the ISRA as necessary to help reach their goals.
Prior to conducting the work of the assessment, team members will benefit from learning more about the realities of, and strategies for, student retention. Direct team members to review the handbook, Retention 101, which is available as a free resource on this website.
The ISRA is a planning tool, not a survey instrument. As such, the conduct of the ISRA should be integrated into your institutional planning processes as much as possible. To initiate discussion within your institution, we have suggested a process for conducting the ISRA, but it is imperative to define a process that works within your own institutional context.
Following the convening of your cross-institutional team, it is recommended that the team review the ISRA introduction and instructions and decide upon which (if not all) sections it would like to complete and in what order. The ISRA is designed as a series of six discrete sections (Institutional Context; Recruitment & Admissions; Financial Aid; Student Services; Academic Services; Curriculum, Teaching & Learning), and sections may be completed in any order. We recommend that your team consider one of the following options:
- Single-Section Assessment: If your institution has addressed the issue of student retention in its strategic planning and has already identified a certain area of interest for further study (e.g. Financial Aid), then select and complete the appropriate section of the assessment only.
- Partial Assessment: If your institution has somewhat addressed the issue of student retention in its strategic planning and has already identified a few areas of interest for further study, then select and complete appropriate sections of interest only.
- Full Assessment: If your institution has not addressed the issue of student retention in its planning within the last five years, it is recommended that the entire assessment be conducted.
It is critical to respond to the assessment in the most informed manner possible. For this reason, the team should identify available sources of evidence for the assessment, including feedback from affected constituencies on campus, such as students, staff, and faculty, which may be in the form of survey data; policies and procedures documents; a copy of the institution’s IPEDS submission; program review data; etc. If such sources are lacking, your team should work with your institution’s IR unit to collect the necessary data. It is recommended that the team commit at least one meeting to reviewing the assessment questions and familiarizing themselves with the data sources relevant to the assessment section being conducted before beginning the assessment process.
It is strongly recommended that you print each assessment section and use it as a discussion agenda with your team prior to entering data and information. Each assessment section begins with global questions concerning the strategic framework (mission, goals and objectives, policy and practice, and assessment and evaluation) of this institutional area. Since the area under consideration (e.g. Student Services) may represent the activities of several departments and/or units on your campus, take the time in your team discussions to consider their shared mission and goals, for example. These strategic framework questions are followed by a list of components of the area under consideration. For instance, the Student Services section includes the following components: Housing and Residential Life; Commuter Student Services; Counseling Services; Campus Activities and Climate; and Health. Each component is assessed using a similar format. Questions concerning the strengths and weaknesses of this component are asked, followed by inventory worksheets of best practices for this component.
The inventory worksheets employ Likert scales and ask team members to rate the degree of implementation of selected best practices on their campus. It is critical that responses to these worksheets are based on empirical or institutional evidence and not solely on the subjective opinions of team members. It is also important that the entire team discuss and agree upon responses to these worksheets, even if an individual team member has been charged with the responsibility of initially responding to the question, to ensure as bias-free a process as is possible. The worksheets also ask for narrative descriptions of the institution’s practice, evidence to support the rating of implementation, and a rating of how this practice contributes to student success. The team will also consider if this institutional practice requires improvement and, if yes, what could be done to improve its effectiveness. This narrative information will assist the team in prioritizing the strengths and weaknesses of this component in the final step of the section assessment process.
After the team has completed a section, a summary report of the section assessment will be provided. This report will include all the narrative responses to questions as well as responses to rating questions. Responses to rating questions will also be averaged so that team members can see the average rating of a program component and for the entire section at a glance. Based on this report, team members can now complete the final step in the section assessment, which is to review the report and determine the major strengths, weaknesses, and opportunities for improvement (SWO) for each program component and the section as a whole.
Completing the assessment is only the beginning of what should become a process of institutional change. While the exact process is highly dependent upon your institutional context, the team should produce recommendations based on the report for submission to the institution’s senior leadership. One institution consulted for this assessment suggested that, following the preparation of recommendations, the provost would ask the team chair to conduct a series of presentations to various campus constituencies, including academic deans, the student government, etc., in order to build campus-wide support for this focus on student retention. Each institution must find its own unique approach to using the report findings to facilitate institutional change.
When you first access ISRA, you will be asked to create a login name and password and to enter some basic information about your institution. This procedure establishes your institutional file in ISRA. When you log out of ISRA, the data you have entered will be saved in your institutional file.
It is difficult to respond to some best practices statements. For example, look at the statement, “We provide useful financial literacy classes and counseling of sufficient quality to engage and enlighten students in these areas” in the Financial Aid section. Yes, we provide financial literacy classes, but we don’t believe the classes are “of sufficient quality to engage and enlighten students.” We do not, however, offer financial literacy counseling. How would we score this statement when we must answer both “yes” and “no”?
Many of the best practices listed in ISRA contain multiple conditions for success. In the case provided, it is not necessarily a “best practice” simply to offer financial literacy classes and counseling; it is a best practice to offer useful classes and counseling of sufficient quality to engage and enlighten students in these areas. Statements that list multiple conditions for success are admittedly challenging to rate. The point of rating such a statement and responding to the supplementary questions is to consider the conditions necessary for success and compare this best practice to what is currently in place on your campus. If your current practice meets all conditions for success, the rating should be “5.” If your current practice meets some, but not all, of the conditions, a rating of “3” might be more appropriate, and so on.
Most importantly, do not let team discussions get stalled over issues of semantics. The ISRA was not designed as a survey instrument. It was designed to lead an institutional team through a reflective process of examining current policies and practices and comparing these current practices to research-based best practices in student retention with the end result of producing a plan for improving student retention on your campus. The written responses to the ISRA are not as important as the process of responding to ISRA questions.
The Educational Policy Institute provides technical support for ISRA. To request technical support, use the Contact function in the ISRA, email info@educationalpolicy.org, or call 757-271-6380.
Feedback is absolutely welcomed and strongly encouraged. It is fully anticipated that ISRA will change and evolve over time as more scholarly research is conducted on student retention and success. Responses from practitioners are also critical to shaping the continued development of this tool. ISRA users are encouraged to provide feedback through the Contact form provided on the website. Also, if your institution believes that it employs a best practice in student retention on its campus, please submit information on this practice to our Effective Practices database. Submission information is provided on the ISRA website.