Universal Student Ratings of Instruction (USRI) Archive
Collect feedback that counts
Universal Student Ratings of Instruction (USRIs) gathered feedback from classes to help instructors, departments and faculties improve curriculum and instruction. The results also served as one important factor in decisions affecting the career of instructors.
The collection of USRIs was regulated by GFC Policy Section 113, the Teaching, Learning and Evaluation Policy.
- When did USRIs occur?
- What was the USRI process?
- When were USRI results available?
- Where can instructors find USRI Instructor results?
- Are USRI results made public?
- What does the USRI report consist of?
- What does the USRI instrument look like?
- USRI Reference Data
- What provisions were made for student anonymity in the USRI system and process?
- What should I do if I have problems logging into the USRI system?
When did USRIs occur?
Generally, USRIs were available for students to complete once the withdrawal deadline for classes had passed, and remained available until the last day of classes.
What was the USRI process?
Step 1: Prior to the start of the rating period, instructors received an email announcement with further information and important dates for the rating period.
Step 2: Once the announcement email was received, instructors notified students that they would be receiving an email with instructions and encouraged participation.
Step 3: When student ratings became available, students received an email with instructions and appropriate links to complete their ratings.
Step 4: During the rating period, instructors received a helpful reminder from TSQS to encourage student participation.
Step 5: Students who had not yet completed their ratings received an initial reminder via email.
Step 6: A secondary email reminder was sent to any students who had not yet completed the rating.
Step 7: Once the rating period was complete, instructors were able to view results online through the USRI Instructor Reports.
When were USRI results available?
Instructors received rating results within 20 working days after the course was completed, and once the dean, director or chair signed the grade sheet.
Where can instructors find USRI Instructor results?
When your course grades are finalized, numerical and commentary reports can be accessed at: SPOT and USRI Survey Reports »
- Log in with your CCID and password.
- Under the "Reporting" navigation, select "My USRI Reports".
Please note that if your class has fewer than 10 students, only the commentary report will be available.
If you have comments or concerns about this process, please contact us at test.scoring@ualberta.ca.
Are USRI results made public?
The current version of this policy may be viewed at the Teaching, Learning and Evaluation Policy.
On October 12, 1993, the General Faculties Council (GFC) of the University of Alberta modified its policy concerning Teaching Evaluation and Student Evaluation of Instruction to include the requirement for the collection of students' ratings of instruction on a University-wide basis using a basic set of mandated questions. The policy also made provision for releasing the associated results to the Students' Union and the Graduate Students' Association. Currently, results are not made "public" unless there have been at least 10 completed questionnaires for a class.
In 2011, online access to results was restricted to registered students. This was followed by allowing instructors to see results for their own classes and, later, providing access to Deans, Directors, and Chairs to view results for their employees. Beginning in July 2012, Deans, Directors and Chairs may extend this access to individuals whom they designate by sending an e-mail to test.scoring@ualberta.ca that provides:
- Their name and employee number
- The name and employee number of the person to whom they wish to grant designated access
Login to Search for USRI Results
What does the USRI report consist of?
A one-page report is generated for each class from which students' ratings have been collected. The Instructor Report contains the text of each of the rating questions appearing on the questionnaire. The questions are reported in the sequence in which they were printed on the questionnaire. Following the text of each question, the numbers of students responding at each point of the rating scale Strongly Disagree (SD), Disagree (D), Neutral (N), Agree (A), and Strongly Agree (SA) are reported. These frequencies are followed by the median of the responses and reference data.
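As an illustration of how the reported numbers relate, the frequencies for a question can be reduced to a respondent count and a median. The sketch below uses hypothetical counts and a plain median; the official report may compute an interpolated median for ordinal data:

```python
import statistics

# Hypothetical response counts for one question:
# Strongly Disagree (SD) ... Strongly Agree (SA), scored 1..5
counts = {"SD": 1, "D": 2, "N": 5, "A": 12, "SA": 10}
scale = {"SD": 1, "D": 2, "N": 3, "A": 4, "SA": 5}

# Expand the frequency table into individual ratings
responses = [scale[k] for k, n in counts.items() for _ in range(n)]

n_respondents = len(responses)            # total students who responded
median_rating = statistics.median(responses)
```

With these counts, 30 students responded and the middle of the ranked responses falls in the Agree (4) category.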
Additional USRI reports are available.
What does the USRI instrument look like?
USRI Reference Data
Reference Groups for Comparative Ratings
The columns of reference data display statistics from Tukey's box-and-whisker plot analysis (John W. Tukey, Exploratory Data Analysis, Addison-Wesley Publishing Company, Inc. 1977). The values displayed are derived from all classes in the indicated reference group. These statistics are chosen to achieve two main objectives:
- To summarize skewed distributions of data, and
- To identify outliers from the general population, if they exist.
The median value (the middle of a ranked set of numbers) is generally preferred over the mean to identify the centre of a skewed distribution of scores. This is the value below which 50 percent of the medians from other classes lie. Please note that data for the items in the current set of mandated questions are accumulated from Academic Year 2005/06 and beyond. If an item (question) has not been used at least 15 times by the indicated reference group since then, the reference data cells will be filled with the text "too few uses". It is theoretically possible for all median scores in a single year to be above, or below, the Reference Group median.
The 25th and 75th percentiles provide information about the spread of scores around the median. By definition, 25 percent of the scores are above the 75th percentile and 25 percent are below the 25th percentile. Since this occurs by definition, these values should not be used to determine whether a particular score is good or bad.
The lower Tukey Fence, which is the 25th percentile minus 1.5 times the distance from the 25th to the 75th percentile, defines a reasonable limit beyond which a score can be considered an outlier. Outliers are scores that appear to be outside the usual distribution of scores for the population being tabulated (i.e. for the indicated reference group.) Given the nature of the USRI data, the upper Fence will usually be above 5.0 and, therefore, need not be reported.
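The statistics above (quartiles and the lower Tukey Fence) can be sketched in a few lines; this is an illustrative computation over a hypothetical set of class medians, not the official TSQS implementation:

```python
import statistics

def tukey_summary(class_medians):
    """Return the quartiles and lower Tukey Fence for a reference group."""
    data = sorted(class_medians)
    q1, _, q3 = statistics.quantiles(data, n=4)   # 25th and 75th percentiles
    med = statistics.median(data)                 # 50th percentile
    iqr = q3 - q1
    lower_fence = q1 - 1.5 * iqr                  # scores below this are outliers
    # On a 1-5 rating scale the upper fence (q3 + 1.5*iqr) usually exceeds 5.0,
    # which is why only the lower fence is reported.
    return q1, med, q3, lower_fence

# Hypothetical class medians for one item in a reference group
medians = [3.8, 4.0, 4.1, 4.2, 4.2, 4.3, 4.4, 4.5, 4.6, 2.1]
q1, med, q3, fence = tukey_summary(medians)
outliers = [m for m in medians if m < fence]       # classes flagged as outliers
```

Here the single low class median (2.1) falls below the lower fence and is flagged as an outlier, while the rest of the group is not.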
Please note that some items can be expected to elicit higher ratings because they are closer to "apple pie" types of items (i.e., we would expect the item to be rated quite positively). This is illustrated by the campus-wide results accumulated in the years 2000-2004 for the two items shown below.
| Item | Tukey Fence | 25% | 50% | 75% |
| --- | --- | --- | --- | --- |
| The instructor treated students with respect. | 3.4 | 4.3 | 4.6 | 4.8 |
| Overall, the quality of the course content was excellent. | 2.9 | 3.8 | 4.1 | 4.3 |
This suggests that the median obtained for the first item in a particular class can be expected to be 0.5 of a rating point above that for the second item, simply because that has been found to be the case in results from thousands of classes surveyed at the University of Alberta. Note that the 25th percentile for the first item corresponds to the 75th percentile for the second item.
Also, the reference group used for a particular class consists of all classes in the indicated department or faculty. One of the most consistent findings of researchers studying students' ratings of instruction is that the ratings obtained for items such as those addressing general satisfaction with a course or instructor depend on the discipline in which the course is taught. Franklin and Theall (1995) reported that "professors in fine arts, humanities, and health-related professions are more highly rated than their science, engineering and math-related colleagues." There appears to be a combination of reasons for these differences, including diversity in the characteristics of the students, in the nature of the subject matter, and in the course objectives that are emphasized in different disciplines. The sizes of the differences, and the conclusion that they are not necessarily related to characteristics of the instructors in the different disciplines, lead to the advice that "we must continue to be very cautious about — if not prohibited from — using the results of student evaluations to make comparisons across disciplines" (Marincovich, 1995).
For example, the item "Overall, this instructor was excellent" illustrates that results at the University of Alberta are consistent with the research studies. The reference data from some of the departments in which a large number of classes have been surveyed appear in the following table.
| Department | Tukey Fence | 25th percentile | Median | 75th percentile |
| --- | --- | --- | --- | --- |
| Physics | 2.4 | 3.7 | 4.1 | 4.5 |
| Computing Science | 2.5 | 3.7 | 4.1 | 4.5 |
| Electrical & Computer Engineering | 2.7 | 3.9 | 4.2 | 4.6 |
| Mathematical & Statistical Sciences | 2.8 | 3.9 | 4.2 | 4.6 |
| Earth & Atmospheric Sciences | 3.0 | 4.0 | 4.3 | 4.6 |
| Biological Sciences | 3.1 | 4.0 | 4.3 | 4.6 |
| English | 2.8 | 4.0 | 4.4 | 4.7 |
| Modern Languages & Cultural Studies | 2.9 | 4.0 | 4.4 | 4.8 |
| History & Classics | 3.4 | 4.2 | 4.5 | 4.7 |
| Elementary Education | 2.7 | 4.0 | 4.5 | 4.8 |
| Drama | 2.9 | 4.1 | 4.7 | 4.9 |
What provisions were made for student anonymity in the USRI system and process?
GFC Policy 111.3 D, now rescinded, stated the importance of student anonymity in completing course and instructor survey questions. Free expression of views in the ratings is essential, so long as the safety of members of the university community is upheld. The following measures ensured anonymity in the process:
- The survey administrator cannot identify the student through the survey tools unless the student self-identifies.
- The survey tools are truly anonymous.
- Your ID/username does not get tagged on the survey results.
- You must log in for verification that you have taken, partially taken, or not taken some or all of your surveys. Again, your answers to survey questions are completely separate from this verification.
- Circumstances that warrant overriding anonymity are spelled out in GFC policy 111.3 D (see above). Threats to the safety or well-being of members of the University community will not be tolerated, and would result in actions in order to identify the author of the statements according to GFC policy.
- These surveys or ratings are conducted so that "the results help instructors and departments or faculties to initiate constructive change in curriculum and instruction. In addition, the results are one important factor in decisions affecting the career of your instructor." (GFC policy 111.3 C).
How did USRIs support ratings if there were multiple instructors for a course, as in the case of team-teaching?
In such cases, student ratings were configured to provide numerical and open-ended questions that repeated for each instructor. The questionnaire was arranged so that the questions applying to the overall course appeared first, followed by the questions applying to the instructor; the instructor-related questions were repeated for each instructor involved in the course.
When results were compiled and ready for viewing, each instructor received the questions that applied to the overall course along with their individual results. This method eliminated the need to generate separate questionnaires for each instructor in a team-taught class in order to maintain confidentiality among instructors.
Example of a team teaching questionnaire
What should I do if I have problems logging into the USRI system?
If you have difficulty logging into the ratings system, please try the following:
- Clear your browser cache. For assistance on clearing your browser cache refer to the following help article: How to Clear Browser Cache and Cookies.
- Check the status of your CCID to ensure it is valid and functioning properly by accessing https://myccid.ualberta.ca/
If you are still having problems logging in, please contact the Staff Service Centre.