
Using Data to Improve Your Online Course


When you enhance an online course you’ve previously taught or developed, you have a wealth of experience on which you can base your decisions. Your knowledge of the student population, the lessons you’ve learned from their feedback, and other anecdotal information can serve as a worthy guide when it comes to pinpointing just what you want to change.

What often goes overlooked, however, is the data that’s saved in the learning management system (LMS), which can be an invaluable tool for determining which elements of your course need attention. Hung and Zhang (2008) note that course data provides a powerful way to learn patterns, preferences, and progress related to student achievement. When used appropriately, course data can give a clear, unbiased picture of how students engage with your course elements.

In this article, we’ll discuss a few different ways you can mine this type of data in your online course and how to use it to improve your students’ learning experience.

Before You Begin

As a quick note of caution, you might not have access to all the tools discussed here, and that’s OK. Most LMSs have some limitations, so it’s possible—likely, even—that you’ll have to get creative with how you find course data. With that in mind, consider and make the most of the following tools when determining what changes to make to your course.

Also, when analyzing course data, remember that correlation does not necessarily imply causation. Just because two variables appear to relate to one another does not mean that one causes the other. For instance, if you see a drop in student engagement during a certain week, it doesn’t necessarily mean that your materials didn’t engage students. It could mean that students had deadlines to meet in other courses, or perhaps another assessment you have on the horizon demanded more of their attention than you anticipated. Use whatever context you have when interpreting correlations in your course data and drawing conclusions from them.

Now, let’s look at some sources of course data in your online course.

Student Evaluations

Student evaluations refer to the surveys students receive and respond to at the end of their course. These evaluations aren’t the final word when it comes to determining what works and what doesn’t, but they provide valuable insight into your students’ satisfaction in the course and their perception of your courseware and instruction. Although you’ll likely see outliers in both directions, overall, evaluations can help you gauge how well students think the course meets their needs.

When looking at survey results, try to identify patterns in students’ responses. For instance, if the average rating for students’ satisfaction is a 2.2/5, it’s probably worth noting. Conversely, though, if an average has dropped significantly because of one student’s extremely low score, you don’t need to be as concerned. Patterns can help indicate the opinions of the student body at large, not just those who have a strong opinion one way or the other.
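If your survey tool lets you export raw ratings, a few lines of Python can help you tell a genuine pattern from a single outlier. The following is only a minimal sketch; the file name and the "satisfaction" column are hypothetical stand-ins for whatever your export actually contains.

```python
import csv
from statistics import mean, median

# Hypothetical export: one row per response, with a numeric "satisfaction" rating (1-5).
with open("course_evaluations.csv", newline="") as f:
    ratings = [float(row["satisfaction"]) for row in csv.DictReader(f)]

print(f"Responses: {len(ratings)}")
print(f"Mean:   {mean(ratings):.2f}")
print(f"Median: {median(ratings):.2f}")  # much less sensitive to one extreme score

# A low mean with a healthy median suggests an outlier is dragging the average
# down; low values for both suggest a broader pattern worth addressing.
low = [r for r in ratings if r <= 2]
print(f"Ratings of 2 or below: {len(low)} of {len(ratings)} ({len(low) / len(ratings):.0%})")
```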

You should also consider your course’s context. (If you need a refresher on the course design triangle, including course context, you can find one here.) For example, if you have a small course, outliers will more heavily impact average scores on evaluations. Similarly, if students struggled in your course, they may be more critical during the evaluation process. This does not mean, however, that you should not challenge your students. Rather, consider how these results fit into the overall context of your course and identify adjustments as needed.

Finally, it’s possible that your institution’s survey contains both quantitative and qualitative data. While quantitative data serves as an important indicator of how students perceive your course, qualitative data also offers valuable insight. When reviewing students’ comments, note recurring themes and outliers, just as you noted patterns and outliers in the quantitative data. Also take time to consider these comments in the context of your own teaching philosophy. Re-read your objectives to reflect on what you wanted students to get out of the course. Did they get it? Does their success align with your course goals? Consider areas of discrepancy as you rework the next iteration of your course.

Activity Logs

Black, Dawson, and Priem (2008) note that the activity log files in LMSs provide one of the most promising sources of automatically gathered e-learning data. Many LMSs will provide instructors with data on which elements students clicked on, how often they visited each element, and even how long they remained on each element. This information can be particularly helpful when compared with how much time you think students should spend on a particular resource.

For example, if students spend only a minute or two on a much lengthier and in-depth piece of instructional material, you might consider how you can break it up or make it more engaging. Similarly, if students spend too much time on a simple resource, you might examine whether the layout is confusing or problematic.
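Many LMSs let you export activity logs as a spreadsheet, and a short script can flag resources where students’ time deviates sharply from your expectations. The sketch below assumes a hypothetical export with "resource" and "minutes_spent" columns, and it uses rough thresholds (half or double the expected time) that you would tune for your own course.

```python
import csv
from collections import defaultdict
from statistics import median

# Your own estimates of how long each resource should take, in minutes.
expected_minutes = {"Week 3 reading": 25, "Syllabus overview": 5}

# Collect each student's time on each resource from the (hypothetical) export.
times = defaultdict(list)
with open("activity_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        times[row["resource"]].append(float(row["minutes_spent"]))

for resource, expected in expected_minutes.items():
    actual = median(times[resource]) if times[resource] else 0.0
    if actual < 0.5 * expected:
        print(f"{resource}: median {actual:.0f} min vs. ~{expected} expected -- too little time?")
    elif actual > 2 * expected:
        print(f"{resource}: median {actual:.0f} min vs. ~{expected} expected -- too much time?")
```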

When reviewing your activity logs, look for the following behaviors and consider the corresponding strategies for how you might be able to encourage improvements.

Behavior: Students not clicking on a resource
Improvement strategies:
  • Revise your module introduction.
  • Mention the resource in your assessment directions.
  • Remind yourself to direct students to the resource in discussion forums, LMS chats, etc.

Behavior: Students spending too much time on a resource
Improvement strategies:
  • Ensure that the resource is cognitively appropriate for your students.
  • Rewrite navigation instructions to ensure clarity.

Behavior: Students spending too little time on a resource
Improvement strategies:
  • Ensure the module introduction and relevant assessments mention the importance of the resource.
  • Intentionally foster value and expectancy in the resource itself.

Behavior: Students spending too much time on an assessment
Improvement strategies:
  • Ensure that the assessment is cognitively appropriate for your students.
  • Ensure that the assessment aligns with your learning objectives.
  • Ensure that the assessment has clear instructions.

Behavior: Students spending too little time on an assessment
Improvement strategies:
  • Ensure that the assessment is not too simple for your students.
  • Ensure students see the alignment of the assessment to learning objectives.
  • Ensure that the assessment (and module) fosters value and expectancy.

Behavior: Students not posting to forums
Improvement strategies:
  • Include an icebreaker or other tool at the beginning of your course to help build an online community.
  • Revise your module introductions to ensure students know that forums play an integral part in the learning process.
  • Highlight discussion forum requirements.

Behavior: Students posting too often to forums
Improvement strategies:
  • Ensure that students’ comments are related to the discussion forum topic.
  • Remind students of the discussion forum and netiquette rules in your course (either in the syllabus, rubric, or other discussions).

Quiz Reports

If your course uses automatically scored assessments, the LMS should have data available on how students interacted with them, similar to the information you’d retrieve from activity logs. Assessment data will show how long students spent on a quiz, how long they spent on certain questions, how many times they attempted a quiz or question, and which incorrect answers they selected instead of the correct one. This data can help you determine whether assessments are too difficult or too easy.

According to Dick, Carey, and Carey (2015), one of the most crucial pieces of information in instructional design is an understanding of learners’ skills, preferences, and attitudes. One of the challenges to gaining this understanding, whether online or face-to-face, is identifying the right level of challenge throughout the course.

Finding this “sweet spot” of challenge helps keep students motivated in your course. Too easy, and your students’ sense of value in the course material will plummet. Conversely, an overly difficult course may discourage students and cause them to disengage.

Your auto-graded quizzes can give you a snapshot of students’ perception of difficulty. Accordingly, if you see that students progressed through a quiz too quickly, consider revising questions to increase the difficulty (while remaining committed to the learning objectives, of course). Alternatively, if students struggled with a quiz, take a look at the questions they got wrong to determine how you can better support them.
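If your LMS exports quiz results, a quick pass over the file can surface both signals at once: attempts finished suspiciously fast and the questions missed most often. The column names in this sketch ("attempt_id", "attempt_minutes", "question", "correct") are hypothetical; most quiz exports contain something comparable under different labels.

```python
import csv
from collections import Counter

fast_attempts = set()
misses = Counter()

with open("quiz_report.csv", newline="") as f:
    for row in csv.DictReader(f):
        # One row per question response; "attempt_minutes" is the whole attempt's duration.
        if float(row["attempt_minutes"]) < 3:          # your own "too fast" threshold
            fast_attempts.add(row["attempt_id"])
        if row["correct"].strip().lower() in ("false", "0", "no"):
            misses[row["question"]] += 1

print(f"Attempts finished in under 3 minutes: {len(fast_attempts)}")
print("Most frequently missed questions:")
for question, count in misses.most_common(5):
    print(f"  {question}: missed {count} times")
```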

Rubric Analysis

In their review of rubric use in higher education, Reddy and Andrade (2010) found that students who have rubrics to guide their work experience higher achievement. Rubrics ensure consistency and objectivity, and they help communicate assignment expectations clearly and concisely.

When examining how you can better meet students’ needs in your online course, take a look at your students’ performance based on these rubrics. If students consistently score low in certain criteria, what resources can you add or modify to help them? Or, put another way, how can you reshape your instruction to better prepare them to succeed? When revising your rubric, consider the following:

  • Make sure the criteria align with the objective and assessment. We know it’s important for our objectives, assessments, and instruction to align with one another. It’s equally important, though, for your rubrics to align with these. After all, you use rubrics to evaluate whether your students have met the learning objectives! With that in mind, look closely at your criteria. Do the verbs align with those in the objective and assessment? If not, you might need to revise the assessment, objective, instruction, or even the rubric itself to better reflect the entirety of the process. Remember, solid course design depends on alignment.
  • Make sure students have the appropriate resources available to them. If students consistently score low in a particular section of your rubric, consider including additional resources in your course. Revising your learning materials, adding supplemental resources, and even developing practice activities can help your students master these skills ahead of the assessment.
  • Make sure you have clear criteria. Although students should have access to your rubric ahead of time, it doesn’t do much good if they don’t understand the criteria. When reviewing your criteria, make sure your expectations leave no questions. Students should be able to look at each level and know what they have to do to meet that level’s criteria. For instance, the criterion “Essay contains few grammatical and/or spelling errors” could be clarified as “Essay contains 3–5 grammatical and/or spelling errors.”
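If you can export rubric scores from your gradebook, a small script can show which criteria students consistently miss. The sketch below assumes a hypothetical export with "criterion", "points_earned", and "points_possible" columns, and it flags criteria where the class average falls below 70 percent, a threshold you would adjust to fit your course.

```python
import csv
from collections import defaultdict

earned = defaultdict(float)
possible = defaultdict(float)

# Hypothetical export: one row per student per rubric criterion.
with open("rubric_scores.csv", newline="") as f:
    for row in csv.DictReader(f):
        earned[row["criterion"]] += float(row["points_earned"])
        possible[row["criterion"]] += float(row["points_possible"])

# List criteria from weakest to strongest average performance.
for criterion in sorted(earned, key=lambda c: earned[c] / possible[c]):
    pct = earned[criterion] / possible[criterion]
    flag = "  <- consider added resources or clearer criteria" if pct < 0.7 else ""
    print(f"{criterion}: {pct:.0%}{flag}")
```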

Conclusion

No matter how you collect data to revise your course, use it with your students in mind. Any change you make can potentially increase student success, but change for the sake of change – instead of change that’s rooted in your course’s learning objectives – can hinder your students’ learning experience. When looking at student evaluations, activity logs, quiz reports, or your rubrics, ask yourself, “How does this information illustrate how I can better set up my students for success in my course?” The answer to that question will ultimately help you determine how to revise your course.

References

Black, E. W., Dawson, K., & Priem, J. (2008). Data for free: Using LMS activity logs to measure community in online courses. The Internet and Higher Education, 11(2), 65–70.

Dick, W., Carey, L., & Carey, J. O. (2015). The systematic design of instruction. Boston, MA: Pearson.

Hung, J., & Zhang, K. (2008). Revealing online learning behaviors and activity patterns and making predictions with data mining techniques in online teaching. MERLOT Journal of Online Learning and Teaching, 4(4), 426–437.

Reddy, Y. M., & Andrade, H. (2010). A review of rubric use in higher education. Assessment & Evaluation in Higher Education, 35(4), 435–448.
