
Course Evaluation

Hope College has always made effective, energizing instruction one of its main priorities. It is this spirit that guides our desire to continuously develop the instructional skills of faculty and academic staff.

Course evaluations are completed through EvaluationKIT, a system that integrates with Moodle.

View Student Learning Experience (SLE) Course Survey 

Divisions, departments and instructors may add additional course-specific questions to the surveys. Results of instructor-created questions will be viewable only to the instructor. This allows students to complete only one evaluation per course while still allowing divisions or individuals to survey for perceived student learning outcomes or course innovations.

The new platform, EvaluationKIT, was selected after a group of faculty, staff and administrators reviewed several products to replace the scantron-based SIRS, which was provided by ETS. Because ETS no longer administers that instrument, the group had to select a new course evaluation service, keeping the perspectives of diverse stakeholders at the forefront. EvaluationKIT allows us to stay aligned with the current evaluation policies and procedures outlined in Hope’s Faculty Handbook. Additionally, it allows departments, divisions and individual instructors to add questions whose results will not move above that particular level (departmental, divisional or individual). The spirit of continually advancing the holistic development of our students will always guide how we approach instruction at Hope, and we see student course evaluations as one piece of that guide.

TIMELINE - Spring 2021
  • Late December: Course evaluation system opens for faculty to add custom questions
  • January and February: Divisional deans' assistants confirm the survey distribution list
  • February 10: Pre-survey email notification sent to instructors 
    • If you note any errors in the course(s) being evaluated, send a note to your divisional dean's assistant. If you do not have an academic department, email the system administrator (Laura McMullen) at mcmullenl@hope.edu.
  • March 1–March 12: First-half-semester courses evaluated
  • Two weeks before survey start date: Pre-survey email notification sent to instructors
  • Five days before survey start date: Pre-survey email notification sent to students
  • Survey start date: Invitation sent to students to complete evaluations
    • Response rates become available to instructors in EvaluationKIT.
    • Students will receive a certificate of completion when they have completed a course's survey.
  • April 23–May 7: Second-half-semester and full-semester courses evaluated
    • Please leave time in class for students to complete these evaluations. If these dates don't fit your course schedule, contact your divisional dean's assistant to adjust the evaluation dates for your course.
    • Students will receive occasional email reminders for any outstanding evaluations.
  • June 2: Results released (to instructors, chairs and deans)
    • Results will not be released to instructors if final grades have not been submitted.
    • Results associated with SALT or custom questions will be released to the instructor only.
How to access EvaluationKIT

You can access EvaluationKIT in two ways:

  1. Through an email sent to you from the EvaluationKIT system. An email will be sent to you when surveys are launched or results are available. Do not share this email or link with anyone as it is unique to you. It acts as both the door and the key to your EvaluationKIT survey experience.

  2. Through Moodle. Instructors will see a link to EvaluationKIT on the right side bar of their Moodle homepage, under the text "EvaluationKIT User Access." If you do not see this link on your homepage, try adding the EvaluationKIT block to your Moodle Dashboard page.
Best Practices for Conducting Course Evaluations

EvaluationKIT is integrated with Moodle, so when students log in to Moodle, they will see a reminder to complete their course evaluation surveys. Additionally, they will receive an email with a unique link for completing the survey(s) for their course(s). We recommend announcing that this email is coming; the platform will send an email to instructors a few days before it sends its initial invitations to students.

  • Course evaluations will be sent to students two weeks before the end of class.
  • Instructors should pick a day to leave 5–10 minutes of in-class time for completing the course evaluation (longer if custom questions have been added). You can do this if you are teaching in a face-to-face, hybrid or online format (students can complete the survey on any digital device).
  • Instructors should confirm that students can access the survey through email or Moodle.
  • Instructors should leave the room while students complete the survey.
  • Students can access the evaluation in two ways:
    • Through an individualized email sent on the evaluation launch date (typically a week before the end of class). This email will contain a list of all evaluated courses and a link to each course evaluation. Students can find this email by searching their 1.hope email for the subject line "Link to Your [Semester name & year] Course Evaluation(s)" (for example: Link to Your Spring 2021 Course Evaluation(s)). The sender will be hopecoursesurvey@hope.edu.
    • By logging in to Moodle any time after the evaluation launch date and up until the evaluation close date (typically the last day of class, the week before finals).

Overall, we want to continue to provide opportunities for students to share their feedback with instructors. We also note that student evaluations are just one piece of feedback regarding formal instruction. We deeply value other critical components of instructor development, such as trained peer review, instructors' reflection essays, and discussions following classroom visits by department chairs and divisional deans.

Course Evaluation Parts & Questions

Course evaluations may contain multiple parts:

  1. College-wide survey instrument
    Results are viewable to instructors, their course supervisors, and their divisional deans.
  2. Divisional or departmental questions
    These survey questions are created at a departmental or divisional level and may be given to a select set of courses (for instance all lab courses or all 290 courses) or a full department/division. Results are viewable to instructors, their departmental chairs, and their deans.
  3. Instructor questions
    1. From the SALT question bank
      You may select questions from the SALT question bank in EvaluationKIT. Your individual results will be reported only to you.
      Data from the SALT question bank will be aggregated and visible to others at Hope College. Thus you will be able to compare your results to aggregated college-wide averages. Aggregate data may also be used for departmental, program-specific and college-wide conversations aimed at improving student learning at Hope College. Comment sections will not be visible to others.
    2. Instructor-created questions
      Any questions you create yourself will not be accessible to others. 
I used to administer SALT to my courses. I see that this is no longer available. What do I do?

EvaluationKIT allows instructors to add any Student Assessment of Learning and Teaching (SALT) questions to their course survey. This provides students with just one survey for each course.

On the reporting side, the results of the instructor-generated questions will be viewable to the instructor only and will not be viewable to the chair or divisional dean. This is true for questions selected from the SALT question bank as well as instructor-created questions.

If a question is added from the SALT question bank, the instructor will be able to see their course rank as well as overall averages for that question across the institution. If a question doesn’t exist in the SALT item bank, the instructor can create their own question or, if they feel it would be a useful question for the campus, suggest it for inclusion in the SALT item bank by emailing hopecoursesurvey@hope.edu.

These questions are available for instructors’ use, allowing instructors to monitor the effectiveness of course design and better understand the impact of multiple, new and ever-changing instructional modalities in formal classrooms (both virtual and physical).

How do I add my own questions?

Follow the instructions in Adding Custom Questions (Google Doc) to add custom questions to your survey. You may select questions from a question bank or create your own.

Keep in mind that these questions will be added to the college-wide survey. To reduce survey fatigue in students, limit additional questions to those that target your specific course or objectives and be sure they do not replicate questions on the main survey instrument.

Why course evaluations?

The EvaluationKIT system allows us to survey a broad selection of courses. Integration with Moodle means that all courses are available to be surveyed in EvaluationKIT. By default all courses will be evaluated, and nothing more needs to be done on your part. You will receive emails from hopecoursesurvey@hope.edu confirming your course participation.

Unlike in years past, course evaluations are available to all non-tenure track and adjunct professors. We hope this allows all at Hope to continue to develop their craft and strive for improvements in classroom pedagogy.

Promotion, Tenure, and Review Guidelines

The survey is being administered in accordance with current faculty handbook policies. Any faculty member going up for tenure, promotion or third year review must have the previous three semesters of course evaluations on hand for the review process. If you are a pre-tenure faculty member and are contemplating extending your tenure clock, you should discuss options and potential outcomes with your departmental chair and dean.

Best Practices for Small Courses

Very small courses (fewer than five students) don’t yield reliable results. While these courses will still be surveyed, their results will be considered unreliable measures.

In cases where like courses can be grouped together for more reliable results, we will work with you and your department chair to do so.

If you would like to opt out of having a course or courses evaluated, please discuss this with your department chair or program director. If you are not housed in an academic division (example: some FYS instructors), contact hopecoursesurvey@hope.edu or the director of your program.

TRAINING SESSIONS FOR INSTRUCTORS

EvaluationKIT Basics (Recorded Monday, October 12 at 3 p.m.)

Watch a recording of the EvaluationKIT Basics Workshop 

  • An overview of the user experience
  • Differences between this and the former system(s)
  • Best practices for course evaluation distribution
  • How to access EvaluationKIT as an instructor through Moodle
  • Open question and answer time

EvaluationKIT Custom Questions Workshop (Recorded Friday, October 16 at 11 a.m.)

Watch a recording of the EvaluationKIT Custom Questions Workshop 

  • How to create custom questions within EvaluationKIT (and why you might want to)
  • Why use survey bank questions vs. creating your own custom question
  • Open question and answer time