
Course Facilitator Feedback Surveys


Surveys can be a valuable tool for collecting honest feedback on the design and implementation of an online course. Although most people are familiar with the importance of student feedback surveys, course facilitator feedback surveys can provide equally valuable data about the online classroom.

Because they’ve interacted with students within the course shell, course facilitators often see flaws that others may miss during the design process. They’re also able to provide feedback on areas where students struggled to understand directions or complete specific tasks. The collected survey data from course facilitators can inform course writers, instructional designers, program directors, and department chairs of specific changes they can make to courses to enhance and better support student learning.

This article gives administrators tips on how to survey course facilitators effectively and how to analyze the data they receive from those surveys.

Method of Surveying Course Facilitators

Institutions should send course facilitator feedback surveys via e-mail following the completion of the term. Some institutions elect to send the survey as an attachment to the e-mail; however, this method is not recommended because the data received would then need to be re-entered into another format for analysis and comparison. Instead, institutions should include a hyperlink to the survey within the body of the e-mail to reduce time spent on data entry. It’s imperative that the department chair or program director encourage course facilitators to complete the survey by stressing the importance of the survey data and its use in later course enhancements. In our experience, response rates tend to increase when the department chair or program director reminds course facilitators of the survey release date.

Survey Software

Several platforms and software programs are available for online survey delivery. SurveyMonkey, Google Forms, and an institution’s learning management system (LMS) are some of the most common applications used for course facilitator surveys. Using the survey feature within the LMS is recommended; however, whichever software the institution chooses, it is important to remember that the software is just an aid; it should not affect the content of the survey (Gaide, 2005).

Formatting Questions for Course Facilitator Surveys

Survey questions should be designed so that responses can help identify specific opportunities for course improvement. These questions should be both quantitative and qualitative in nature to provide a variety of data for further analysis and interpretation. Questions should focus on course objectives, course assignments and grading, course content (e.g., textbook and resources), and student workload. For an example of a course facilitator survey, see the appendix section below.

Course Identification Questions

To better assist with future course design and development, surveys should include questions that identify the course. These questions should appear at the beginning of the survey and be required for survey completion. They should capture the name of the course facilitator, the course code and name, and the session number. For additional data, the survey may also ask the facilitator for the date he or she completed the survey and his or her e-mail address.

Likert Scale Prompts

The first step in forming survey prompts is determining what information you are seeking (Stillwagon, 2017). To measure latent constructs (i.e., attitudes and opinions) and sentiment about a specific topic, a Likert scale is recommended. The Likert scale is widely used in online education surveys because it provides quantitative, granular feedback. The scale helps you see areas that need improvement in addition to highlighting areas of success.

A Likert scale is a five-point (or seven-point) scale that allows individuals to express how much they agree or disagree with a statement or question. It typically offers a range of responses from “strongly disagree” to “strongly agree” with a neutral midpoint (e.g., 1 = “strongly disagree,” 2 = “disagree,” 3 = “neutral,” 4 = “agree,” and 5 = “strongly agree”). As you determine your format, it is important to remain consistent throughout your survey (e.g., every prompt should use the same five-point scale).

Each statement or question should have a singular focus. For example, a prompt should not ask the course facilitators if the course work was both an appropriate amount and on topic. Instead, the survey should have one prompt about whether there was an appropriate amount of content and another about whether the content was on topic. Multi-topic questions often lead to confusion and inaccurate results.

Likert scale prompts should also refrain from using negative words or phrases. For example, instead of saying, “course materials were not an appropriate amount of work for students,” phrase the prompt positively: “course materials were an appropriate amount of work for students.”

Examples of Likert Scale Questions

The objectives of this course are consistent with program outcomes.

  1. Strongly Disagree
  2. Somewhat Disagree
  3. Neutral
  4. Somewhat Agree
  5. Strongly Agree

The assignments in the course support students in achieving course learning outcomes.

  1. Strongly Disagree
  2. Somewhat Disagree
  3. Neutral
  4. Somewhat Agree
  5. Strongly Agree

The amount of work required of students was appropriate.

  1. Strongly Disagree
  2. Somewhat Disagree
  3. Neutral
  4. Somewhat Agree
  5. Strongly Agree

This course provided students with the opportunity to interact with one another and learn from one another.

  1. Strongly Disagree
  2. Somewhat Disagree
  3. Neutral
  4. Somewhat Agree
  5. Strongly Agree

The rubrics in this course were clear and provided guidance for how to score assignments.

  1. Strongly Disagree
  2. Somewhat Disagree
  3. Neutral
  4. Somewhat Agree
  5. Strongly Agree

Open-Ended Prompts

Per Wasik and Hindman (2013), open-ended prompts are questions or statements that allow varied, multi-word responses. Course facilitator surveys should conclude with an open-ended section that allows course facilitators to provide qualitative feedback. Surveys can also include an open-ended prompt immediately after a Likert scale prompt; for example, an open-ended prompt may ask a course facilitator to explain the reasoning behind his or her response.

Open-ended prompts are useful for obtaining in-depth information on topics with which you may not be as familiar. When you create an open-ended prompt, avoid building in specifics that might constrain the response (Wasik & Hindman, 2013). These prompts offer a new perspective on course topics, allowing for different interpretations and a variety of responses. This written feedback gives program directors and department chairs more direction and insight into next steps for course improvements.

Examples of Open-Ended Questions

  1. Which assignment or activity did you feel was the most valuable for students, and why?
  2. Please provide any additional feedback below.

Analyzing and Interpreting Course Facilitator Survey Data

Course facilitator survey data creates a valuable, even essential, feedback loop that streamlines course writing and course facilitation. However, to take full advantage of the surveys, institutions need to analyze and interpret the results.

One of the advantages of Likert scale data is that the feedback is quantitative, which makes results easier to interpret. The most convenient way to interpret Likert scale data is to average the values for each prompt (i.e., all participants’ responses to that prompt). On a five-point scale, for example, prompts with averages below 3 warrant further examination. Another approach is to calculate the median of the responses for each prompt. It is important to remember that the number itself does not provide valuable data; the interpretation of that number does.
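As a minimal sketch of this approach, the Python snippet below computes the mean and median for each prompt and flags any prompt whose average falls below the neutral midpoint; the prompts and response values are hypothetical.

```python
from statistics import mean, median

# Hypothetical 1-5 Likert responses, one list of scores per survey prompt.
responses = {
    "Objectives align with program outcomes": [5, 4, 4, 5, 3],
    "Assignments support learning outcomes": [4, 4, 5, 4, 5],
    "Student workload was appropriate": [2, 3, 2, 3, 2],
}

for prompt, scores in responses.items():
    avg = mean(scores)
    # Flag prompts that fall below the neutral midpoint (3) for review.
    flag = " <-- review" if avg < 3 else ""
    print(f"{prompt}: mean={avg:.2f}, median={median(scores)}{flag}")
```

Whether you lead with the mean or the median, the value is only a pointer; a flagged prompt still needs a human reading of the related open-ended responses.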

One of the disadvantages of Likert scale data is that it does not measure the emotional distance between responses, nor does it allow for in-depth feedback. Pairing Likert scale prompts with open-ended prompts, however, optimizes the feedback for course development and instruction.

Unlike Likert scale prompts, open-ended prompts are more difficult to interpret. To analyze survey data from open-ended prompts, look for patterns. First, review the results quickly for repeated words and phrases; this provides a quick overview of the results. Voyant Tools, for example, lets you upload a document and see an overview of the most common words and phrases used.
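If you prefer to script this first pass rather than upload to a tool, a simple word count works well. The Python sketch below tallies the most frequent words across a set of hypothetical responses, skipping a small stop-word list so substantive terms rise to the top.

```python
import re
from collections import Counter

# Hypothetical open-ended survey responses.
responses = [
    "The discussion board instructions confused students.",
    "Students found the week 3 discussion instructions unclear.",
    "Grading rubrics were helpful; discussion directions were not.",
]

# Minimal stop-word list; expand it to fit your own responses.
stop_words = {"the", "a", "an", "and", "were", "was", "not", "found"}

words = []
for text in responses:
    words += [w for w in re.findall(r"[a-z']+", text.lower()) if w not in stop_words]

# Frequent words hint at recurring themes (e.g., "discussion", "instructions").
for word, count in Counter(words).most_common(5):
    print(f"{word}: {count}")
```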

Once you have identified common phrases, determine if you can group any of them together or break them down further. Creating separate categories for these phrases allows for a deeper examination of a specific topic for further interpretation. Program administrators can then determine how best to address common themes, or dismiss anomalous feedback.
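Once themes emerge, the grouping itself can also be scripted. The sketch below maps keywords to broader categories and counts how many responses touch each one; the category names and keywords are illustrative, not prescriptive.

```python
from collections import defaultdict

# Hypothetical keyword-to-category mapping drawn from the common phrases found above.
categories = {
    "instructions": ["instructions", "directions", "unclear"],
    "grading": ["rubric", "grading", "score"],
    "workload": ["workload", "amount", "time"],
}

responses = [
    "The week 3 discussion directions were unclear.",
    "Rubric criteria did not match the assignment.",
    "The reading workload felt heavy for one week.",
]

grouped = defaultdict(list)
for text in responses:
    lowered = text.lower()
    for category, keywords in categories.items():
        # A response can fall into more than one category.
        if any(keyword in lowered for keyword in keywords):
            grouped[category].append(text)

for category, items in grouped.items():
    print(f"{category}: {len(items)} response(s)")
```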

Conclusion

Course facilitator surveys provide program directors and department chairs with powerful information they can use to improve course design and student learning, thus strengthening the institution. These surveys can inform program-level decisions about course design features, course enhancements and improvements, content development, and program expansion. They also provide course writers and instructional designers with suggestions for improving course development, content, and navigation.

By optimizing question types and encouraging course facilitators to participate, institutions can leverage these opportunities for feedback to benefit everyone involved in the online course. With proper evaluation techniques, course writers, instructional designers, department chairs, and program directors will know not only what to improve but also how to improve it.

References

Gaide, S. (2005). Evaluating distance education programs with online surveys. Distance Education Report, 9(20), 4–5.

Stillwagon, A. (2017). How to analyze and interpret survey results. Retrieved from https://smallbiztrends.com/2014/11/how-to-interpret-survey-results.html

Wasik, B. A., & Hindman, A. H. (2013). Realizing the promise of open-ended questions. Reading Teacher, 67(4), 302–311. doi:10.1002/trtr.1218

Additional Resources

Rao, K., Edelen-Smith, P., & Wailehua, C. (2015). Universal design for online courses: Applying principles to pedagogy. Open Learning, 30(1), 35–52. doi:10.1080/02680513.2014.991300

Smyth, J. D., Dillman, D. A., Christian, L. M., & McBride, M. (2009). Open-ended questions in web surveys. Public Opinion Quarterly, 73(2), 325–337.

Wall Emerson, R. (2017). Likert scales. Journal of Visual Impairment & Blindness, 111(5), 488.

Posted January 31, 2018
Author: Samantha Bir