What institutional research can do to support the individual academic

Margaret O’Flanagan*
Royal Institute of the Architects of Ireland
E-mail: moflanagan@riai.ie

*Institutional Analysis and Awards Officer, Registry, Dublin City University, at the time of writing

KEYWORDS: Institutional Research, Quantitative Approaches, Educational Development, Higher Education, Ireland, Academic Teaching

Introduction

All academic practitioners need information, from time to time, to help them decide what to do next. Information sources vary according to the issue at hand. In terms of learning development, reflective practitioners have numerous potential resources available to help them understand the factors affecting student learning. Many qualitative methods exist to evaluate the learning process and experience (e.g. Light and Cox 2001). While qualitative approaches provide an excellent and rich resource, it is important to consider all resources, including those often dismissively referred to as ‘bean counting’: quantitative methods.

Quantitative methods are a central plank of the practice of Institutional Research and provide information that can speed up the assessment of an issue while also providing context. Starting with quantitative analysis can quickly establish whether or not patterns exist, leaving more time and resources available to explore and understand the phenomena behind them. While quantitative methods may not always support detailed insight into the workings of new learning methodologies, they can shed light on the impact those methodologies have on students, and on the intersecting factors influencing the experience of a given cohort.

What is Institutional Research?

Institutional Research is the practice whereby an institution assesses itself, its activities and its position within a given milieu. Higher Education Institutional Research facilities, where they exist, conduct these assessments with the objective of serving ‘as a comprehensive resource for information about the institution’ (University of Florida, Office of Institutional Research, Mission Statement). The data resources employed usually comprise information derived from surveys, student record and other internal record systems, sectoral and national databases and reports and published research. The actual assessments, analyses and hypotheses tested cover issues requiring ongoing monitoring as well as the exploration of emerging issues to inform an institution’s decision-making with regard to its own development.

Institutional Research is a relatively new concept in the Irish context in particular, and development of the practice is so far limited and very uneven. The type of work done by Institutional Research Units in other countries is, in Ireland, generally done across a number of disparate units and services. Some elements of Institutional Research are not currently undertaken, or are not easily available, in most Irish institutions.

Institutional Research and Formal Evaluation in Higher Education

The term “Institutional Research” may be more familiar to academics and education professionals in North America and Australia than it is to those in Ireland and some parts of Europe. Where Institutional Research is well-developed, and even in Ireland where the practice is only now emerging in a recognisable form, common databases and comparative analyses based on shared methodologies allow institutions to compare themselves, or benchmark, against other institutions and agreed standards (e.g. US Common Data Set; UK Higher Education Statistics Agency (HESA), Irish Higher Education Authority (HEA)).

A good starting point for developing an understanding of the scope of Institutional Research is the US-based Association for Institutional Research (AIR). AIR hosts a website (http://www.airweb.org) which reflects a vibrant and creative network of Institutional Researchers. The association has been incorporated since 1965 and has over 3,000 US-based members as well as a number of internationally affiliated associations.

The Centre for the Study of Higher Education (CSHE) at The University of Melbourne, while not restricted to Institutional Research, is an invaluable resource for the tools and principles available for use in the study of Higher Education, whether within an institution or on a much broader basis. Now operating for 35 years, the CSHE undertakes work within the University of Melbourne as well as nationally and internationally. The Centre’s website (http://www.cshe.unimelb.edu.au/) includes a section on academic development outlining services ‘available on request to assist departments in the evaluation, review and planning of strategies to improve the quality of teaching and learning’, which form an excellent basis on which to develop a local strategy for evaluation and development.

In recent years the international drive to improve quality, and to assess quality improvements in a manner supporting transparent comparison and benchmarking across national Higher Education sectors, has been one trigger for improved Institutional Research. The old paradigm, whereby it was believed that Universities were the only bodies capable of assessing University activities, has undergone a seismic shift in recent decades. Now, Higher Education institutions all over the world, including Universities, are required to open themselves to performance evaluation. Like the UK in the 1980s (Johnes and Taylor 1990b), Ireland is moving towards a situation similar to the UK’s ‘evaluative state’ (Henkel 1991), particularly since the implementation of the Universities Act 1997 and in line with an increased demand for accountability in all publicly funded bodies.

While Irish quality review systems are based on peer review of a self-assessment, and formal league tables have been avoided thus far, external review is becoming more common, as seen in the EUA review of Irish Universities and the recently published OECD report (OECD 2004), while the print media in particular are moving towards the generation of informal league tables such as the Sunday Times University of the Year Award. The Irish Higher Education Authority has also published numerous reports on various aspects of the University system (e.g. Morgan et al. 2001; Skilbeck 2001), in addition to its regular First Destinations reports and Annual Reports in paper and electronic formats.

Institutional Research and the individual academic

Our focus here is local information: what the Institutional Research resource, or its equivalent in your institution, can provide in terms of data and analyses to support your teaching and development. Institutional Research offices do not always exist. In some cases there is no such resource; in others there is a full or partial resource, which may be located within a variety of structures, including the student records section or its equivalent, the President’s Office, the Registry or the Learning Development facility in centralised institutional structures. In Dublin City University’s case the ‘Institutional Analysis Office’ (http://www.dcu.ie/registry/emao/index.shtml) is currently located within the Registry’s Awards Team, although it has developed through numerous identities and continues to do so, reflecting the early evolutionary stage of this activity.

Analyses of data drawn from electronic student records systems and surveying, while not comprising the entire gamut of Institutional Research resources and techniques, are the most obviously applicable to the needs of academics seeking to better understand student learning. Individual academics, as well as departments, have always carried out analyses using available data to assess student performance. What distinguishes the current situation is the combination of growing interest in Institutional Research, the advent of integrated computerised records systems and the emphasis on reliable data production methods, for quality review in particular.

This piece does not recommend or promote reliance on empirical statistical analyses alone to inform understanding and promote learning development. Rather, it suggests that the considerable body of data and analyses developed for Institutional Research purposes represents an invaluable resource for academic practitioners seeking context and information to support individual understanding and decision-making.

Institutional Research, based as it is on expert knowledge of available data, the skills to manipulate those data for targeted analyses and the use of student record data in particular, can provide insights into visible and invisible characteristics of increasingly large and diverse student cohorts without the need to carry out surveys to test hypotheses. In essence, from the individual academic’s perspective, Institutional Research can often provide a shortcut to a level of initial understanding, releasing time and resources for well-founded qualitative investigation.

Institutional Research and You

Introduction

This section outlines the types of questions Institutional Research facilities can address to support the individual academic, and provides an example of an Institutional Research project along with examples of the queries often made to the Institutional Analysis Office in Dublin City University.

In DCU, where a substantial number of staff have engaged with Institutional Research as an additional tool to support their decision-making, a wide range of analyses is regularly requested, most commonly concerning the performance and profile of individual programmes, modules and student cohorts.

Many of these types of queries subsequently lead to requests for comparative information. In general, once the internal situation is understood there is a desire to deepen understanding by comparing local activities with practice elsewhere within the institution, or in a similar discipline externally. In Ireland in particular, self-assessment and external assessment have become more widespread. Benchmarking, however, has not.

Questions Institutional Research May Help You to Answer

From the academic’s perspective, one of the greatest benefits of an Institutional Research facility is knowledge of the information available and of other analyses already underway. Having a central resource means that individual academics do not need to ‘reinvent the wheel’ when an issue requiring investigation arises: those in the Institutional Research facility will probably have done a similar analysis in the past and be in a position to undertake the desired piece of research quickly and efficiently. Knowing what information is available is just as important as the ability to manipulate it. Johnes and Taylor (1990a) found, when developing an indicator for the standard of degrees awarded across institutions, that the indicator had to be tempered with explanatory variables, some of which related to student characteristics. These explanatory variables, used to develop an expected value against which the actual value could be rated, included A level scores and the proportion of the student cohort living at home, as well as library expenditure, among the six items used. While library expenditure is not a student-related variable, it is also not one that each individual academic might be expected to include in an assessment aimed at explaining why the standard of awards made in their own institution differs from those made to students in the same discipline elsewhere. The same is true of the impact of the proportion of a student cohort living at home.
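
To make the expected-versus-actual logic concrete, here is a minimal sketch in Python (pandas and statsmodels), assuming a hypothetical institution-level dataset; the file and column names are illustrative, and this is not a reconstruction of Johnes and Taylor’s actual model:

```python
# Illustrative sketch only: hypothetical column names and data, not a
# reconstruction of the Johnes and Taylor (1990a) model.
import pandas as pd
import statsmodels.api as sm

# One row per institution: 'degree_index' is the raw indicator of the
# standard of degrees awarded; the rest are explanatory variables.
df = pd.read_csv("institution_indicators.csv")

X = sm.add_constant(df[["a_level_score",
                        "pct_living_at_home",
                        "library_spend_per_student"]])
model = sm.OLS(df["degree_index"], X).fit()

# The residual (actual minus expected) is the tempered indicator:
# positive values mean outcomes above what the cohort's characteristics
# and the institution's resources would predict.
df["expected"] = model.predict(X)
df["residual"] = df["degree_index"] - df["expected"]
print(df[["residual"]].sort_values("residual", ascending=False))
```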

The following questions and possible analyses illustrate two sample queries likely to benefit from the support of Institutional Research:


Question: Failure rates have risen dramatically in one of my modules, but I have not changed my methods and I can’t see why this has happened.
Possible analyses and items for correlation with performance in the module (a minimal sketch of one such correlation check follows the list):
  • Changes in entry requirements
  • Changes in actual pre-entry educational attainment (e.g. CAO points (ROI), A Level score (UK)) of the cohort
  • Standards achieved pre-entry in core subjects such as mathematics
  • Changes in size of class group
  • Changes in origins of class group (are all native speakers of the language of delivery?)
  • Gender, Age, Educational and Social characteristics, Entry route and attendance type profiles1
  • Range of marks used over time in assessing the module
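
A minimal sketch of the kind of correlation check listed above, in Python with pandas and scipy, assuming a hypothetical module-level file and column names, and a local pass mark of 40%:

```python
# Minimal sketch, assuming a hypothetical file with one row per
# student per year; 'cao_points' and a pass mark of 40 are assumptions.
import pandas as pd
from scipy.stats import pearsonr

students = pd.read_csv("module_results.csv")  # year, cao_points, mark

# Collapse to one row per year: failure rate, mean entry points, size.
yearly = students.groupby("year").agg(
    failure_rate=("mark", lambda m: (m < 40).mean()),
    mean_entry_points=("cao_points", "mean"),
    class_size=("mark", "size"),
)

# Does the failure rate move with either cohort characteristic?
for factor in ["mean_entry_points", "class_size"]:
    r, p = pearsonr(yearly["failure_rate"], yearly[factor])
    print(f"{factor}: r={r:.2f}, p={p:.3f}")
```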


Question: Student retention in my field is poor; I understand some of the reasons why, but I want to address the problem and need a comprehensive picture of what is happening.
Possible analyses and items for correlation with performance in the subject:
  • What is the student profile now, how has it changed and how is it likely to change in the future?
  • What are the particular programme elements contributing most consistently to non-completion? (See the sketch after this list.)
  • What do the students think?
  • Are student expectations of the programme realistic prior to entry?
  • Do entry requirements need to be recalibrated based on changes in standards or curricula outside the Institution?
  • Would a change in programme content, providing extra support in problem areas, help students to progress?
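
As an illustration of the second item in this list, the following Python sketch ranks programme elements by the gap in failure rates between students who progressed and those who did not; the file, column names and pass mark are all assumptions:

```python
# Illustrative sketch: hypothetical flattened dataset with one row per
# student per module, plus an end-of-year progression flag drawn from
# the student record system. The pass mark of 40 is an assumption.
import pandas as pd

records = pd.read_csv("programme_records.csv")
records["failed"] = records["mark"] < 40

# Failure rate per module, split by eventual progression.
rates = (records.groupby(["module", "progressed"])["failed"]
                .mean()
                .unstack("progressed")
                .rename(columns={True: "progressed",
                                 False: "did_not_progress"}))

# Modules with the widest gap are those most associated with
# non-completion, and so are candidates for extra support.
rates["gap"] = rates["did_not_progress"] - rates["progressed"]
print(rates.sort_values("gap", ascending=False).head(10))
```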

This second example, relating to factors that may affect student retention, is of key importance and an excellent example of how different data resources can be readily drawn together by Institutional Researchers in a manner that may be very difficult for individual academics.

Undertaking a project to assess factors affecting student retention is a daunting task, not least because of the breadth of factors that may or may not be included in the research. The first task is to get a general understanding of the institution, the subject area and the departmental and broader environments in which students are operating. The following case study illustrates just such a project and how it might work, based on a current Institutional Research project in DCU.

Case Study/Sample Project: Attitudes, experiences and characteristics influencing student progression - A DCU pilot for assessing the impact of diverse factors through the first year of study

This project is a good example of how the combined skills and resources of a dedicated Institutional Research facility, working with other experts in the institution, can contribute to an understanding of the dynamics affecting student progression and completion. While the project is initially aimed at the institutional level, the methods, as well as the results, can be applied according to the needs of the individual academic. Jointly run by the Institutional Analysis Officer and the First Year Student Support Facilitator in DCU, the project draws together information held on the central student record system and information derived from a series of three student surveys.

The surveys, run at the beginning, mid-point and end of the academic year, are not anonymous. This is made clear to the students when, having been furnished with an outline of the project, they agree to participate in the study and sign an agreement in line with the Data Protection Acts 1988 and 2003. The surveys track changing attitudes as well as academic progress through the first year. In addition to identifying key factors impacting on progression, the study is intended to weed out factors that do not actually have any significant effect on student progression, so that remedies and initiatives can be focussed on the most significant influencing factors.
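
The ‘weeding out’ step could, for instance, be screened with a simple logistic regression. The sketch below is illustrative only: the study itself used SPSS, and every variable name here is hypothetical.

```python
# Illustrative screening step only: the study itself used SPSS, and
# all variable names here are hypothetical.
import pandas as pd
import statsmodels.api as sm

master = pd.read_csv("master_file.csv")  # merged survey + record data

# 'progressed' is assumed to be a 0/1 flag from the student record.
factors = ["self_rated_maths", "hours_paid_work",
           "financial_concern", "sense_of_belonging"]
X = sm.add_constant(master[factors])
fit = sm.Logit(master["progressed"], X).fit(disp=False)

# Factors whose coefficients are indistinguishable from zero are
# candidates for dropping from the shortened questionnaire.
print(fit.pvalues.sort_values(ascending=False))
```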

The study, while not focussed on a specific programme, has the input of a number of academics and is intended to provide guidelines for identifying and addressing emerging retention problems within programmes of study.

The aims of the project, in brief, are as follows:

  1. To explore a wide range of aspects of the experience of undergraduate students with the specific purpose of identifying factors that may influence programme completion.
  2. To ascertain the factors and relationships determining the qualitative nature of the student experience while in DCU.
  3. To explore the interrelationship between pre-entry expectations and experienced reality of the university experience.
  4. To refine understanding of the relevance of different factors affecting student retention, with a view to focussing efforts and resources on the most potent influencing factors.

(O’Flanagan and Crehan 2004)

Data Used for the Study:

The first survey, taken at the point of registration, includes the following core elements:

  1. Biographical Detail,
  2. Self-evaluation of personal characteristics, including tenacity, mathematical and writing ability, ambition, academic ability and self-confidence,
  3. Factors affecting the decision to study at University,
  4. Level of prior understanding of the programme,
  5. Anticipated time spent on specified work, study and social activities,
  6. Difficulties anticipated,
  7. Perceived locus of responsibility for learning and the role of the lecturer,
  8. Priorities while at University, academic ambitions and career goals,
  9. Family educational background,
  10. Financial concerns,
  11. Perception of the experience of studying at Higher level in practical terms, and
  12. The anticipated best and worst elements of the experience of study at University.

The second survey reviews issues assessed in the first including:

  1. Self-evaluation of characteristics,
  2. Level of prior understanding of the programme,
  3. Actual time spent on specific activities,
  4. Difficulties encountered,
  5. Perceived locus of responsibility for learning and the role of the lecturer,
  6. Priorities while at University, academic ambitions and career goals,
  7. Financial concerns, and
  8. The best and worst elements of the experience thus far.

New issues covered in the second survey include:

  1. Self identified changes in perception of study at Higher level having spent six months in the University,
  2. Support services accessed, and
  3. Integration into campus life/sense of belonging.

The final survey revisits the items covered in the second survey and includes a sub-module addressed to those who have chosen to change programmes, defer or withdraw from the institution.

A key element of the study is the combination of the data gathered through the surveys with information stored on the student record system. In addition to aggregate completion rates, individual level data elements are taken from the official record and include:

  1. Academic history including second level results (Leaving Certificate (ROI), A-Levels (UK) or other national equivalents), institution attended, and level of preference for the course onto which participants were accepted,
  2. Entry route (central clearing house (Central Applications Office in ROI) or direct entry on the basis of age or other specified characteristics),
  3. Modular exam results achieved through the year, including continuous assessment marks,
  4. End-of year results,
  5. Other official items of record including withdrawal and reasons for withdrawal, changes in optional programme elements and transfer, and
  6. Completion rates at the institutional and discipline level.

Approach and Outcomes

The data were collected using a paper-based, OMR-readable questionnaire in the first instance, followed up with online surveys, and stored in an SPSS database. Using the student number provided by respondents, records drawn from the University’s student records database (MIS) were matched to the survey responses to create a master file combining both datasets. The MIS data in this master file were updated throughout the year following registration and examinations.
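
For readers who prefer code to prose, the linkage step might look like the following pandas sketch; the real project stored and matched these files in SPSS, and all file and column names here are assumptions:

```python
# Minimal sketch of the linkage step; the project itself used SPSS,
# and the file and column names here are assumptions.
import pandas as pd

survey = pd.read_csv("first_year_survey.csv")  # includes student_no
mis = pd.read_csv("mis_extract.csv")           # official record extract

# Left join keeps every survey respondent; the indicator column flags
# student numbers that failed to match the official record.
master = survey.merge(mis, on="student_no", how="left", indicator=True)
unmatched = (master["_merge"] == "left_only").sum()
print(f"{unmatched} respondents could not be matched to the MIS")

master.drop(columns="_merge").to_csv("master_file.csv", index=False)
```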

The files in which the data are stored and linked are maintained only on a single hard drive and are not accessible via the University’s networks. The data collected in the surveys cannot be accessed by anyone other than the researcher and cannot be linked back into the University’s MIS.

Based on the responses in the first pilot year, the study is being repeated in 2004/05 using revised versions of the questionnaires informed by analysis of those responses. The objective, based on the implementation phase as well as analysis of the available data, is to refine the instrument into a shorter questionnaire and database tool that can be used to quickly assess the factors behind emerging retention problems.

So far, the data have been analysed at Institution, Faculty and Programme levels where there was sufficient information to do so, and the results communicated, in summary form, to the relevant managers. Further analysis was made available on request, including the use of additional data from the MIS where testing of additional hypotheses was warranted.
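
A minimal sketch of such multi-level summaries, again with hypothetical names and an assumed minimum cell size for reporting:

```python
# Minimal sketch: progression rates at successive reporting levels,
# suppressing small cells. Column names are hypothetical, and the
# minimum cell size of 30 is an assumption, not a DCU rule.
import pandas as pd

master = pd.read_csv("master_file.csv")

print(f"Institution: {master['progressed'].mean():.3f}")
for level in ["faculty", "programme"]:
    summary = master.groupby(level)["progressed"].agg(["mean", "size"])
    print(summary[summary["size"] >= 30].round(3))
```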

Conclusions

Using Institutional Research

Evaluation based on statistical techniques can be daunting at first and, when unfamiliar, even frightening. Quantitative analysis has been left out in the cold, to some extent, because it has been seen as ‘incomplete’ and lacking in perspective. There is no doubt that statistics need to be interpreted. However, in an increasingly evaluative culture, as is the case in Ireland at present, avoiding quantitative approaches is not only blinkered but also counter-productive, as it means losing out on high-quality support tools that can contribute greatly to reflection, understanding and development.

Modern databases, computerised techniques and skilled researchers make possible a host of analyses that were unavailable to academics in the past, or at least much more difficult to undertake or access. Before embarking on any form of quantitative analysis it is important to consider the origin and quality of the data to be used. If you have an Institutional Research facility available to you, or a student records or comparable office, it is worth exploring with colleagues in those services exactly what data are available, where the data came from, what legal restrictions or implications may pertain to their use, what comparable analyses are available for benchmarking purposes should you require them, and what level of reliability testing may be required. If benchmarking, it is important to ensure that data from external sources are of the same standard of quality and accuracy as the data sourced within your institution.

Reports produced internally, for internal or external purposes, are a good starting point when familiarising yourself with your Institutional Research function, if your institution has developed one already, and will generally suggest the analyses available according to the types of data included in the reports. In the case of DCU, the provision of such reports generally results in further, more detailed queries specific to individual programmes, modules or student cohorts. This is where the value of Institutional Research to the individual academic comes into its own.

References

   Henkel, M. (1991). Government, Evaluation and Change. London: Jessica Kingsley.

   Johnes, J. and J. Taylor (1990a). Degree Results: Differences between Universities. In Performance Indicators in Higher Education, pp. 109-118.

   Johnes, J. and J. Taylor (1990b). The clamour for performance indicators. In Performance Indicators in Higher Education, p. 1.

   Light, G. and R. Cox (2001). Evaluating: teaching and course evaluation. In Learning and Teaching in Higher Education: The Reflective Professional, pp. 195-216.

   Morgan, M., R. Flanagan, and T. Kellaghan (2001). A Study of Non-Completion in Undergraduate University Courses. Higher Education Authority, Dublin.

   OECD (2004). Review of Higher Education in Ireland.

   O’Flanagan, M. and M. Crehan (2004). Attitudes, experiences and characteristics influencing student progression - A DCU pilot for assessing the impact of diverse factors through the first year of study. In S. Moore (Ed.), Proceedings of the Irish Retention Network Colloquium, Ireland. Forthcoming.

   Skilbeck, M. (2001). The University Challenged - A Review of International Trends and Issues with particular reference to Ireland. Higher Education Authority, Dublin.

   University of Florida, Office of Institutional Research. Mission Statement.

 

1 Note: this type of analysis would be aimed at identifying whether the pedagogical approach is appropriate to the students’ prior experience. For example, it might indicate that the cohort profile has shifted towards older learners, to whom the existing pedagogical approach may not be appropriate.