Toolkit

The resources highlighted below are a curated sample of SoTL research assessment tools – ways of quantifying elements of the student experience. We do not present these as an exhaustive list; rather, we hope that faculty will use this page as a starting point and explore the literature to identify the tools that are most relevant to their research questions.

Adapted from Yen et al. (2010)*, this survey addresses attitudes toward a novel technology across three dimensions: (1) impact, (2) perceived usefulness, and (3) perceived ease of use. Students rate each statement on a Likert scale from 1 (Strongly Disagree) to 5 (Strongly Agree). The survey is useful for comparing a newly implemented technology against a more traditional (but still technology-based) alternative; for example, one might use it to compare virtual reality-based class sessions with Zoom-based class sessions. This template version refers to [classroom context] (for example, recitation sections) and [discipline] (for example, biochemistry); these placeholders should be filled in as appropriate for the particular implementation. The technology is intentionally referred to generically as “technology” so that the same survey version can be used by all respondents. The final section, “Additional Items”, is a set of optional items that were not part of the original instrument but may be useful to include: a check that students did not perceive the instructor to be more invested in teaching in one modality (i.e., with one technology) than the other (item 1), and opportunities for free response (items 2-3).


Impact

  1. The technology used during my [classroom context] would be a positive addition for [discipline] students.
  2. The technology used during my [classroom context] is an important part of meeting my needs related to learning [discipline].

Perceived Usefulness

  1. The technology used during my [classroom context] makes it easier to learn [discipline].
  2. The technology used during my [classroom context] enables me to manage my learning more quickly.
  3. The technology used during my [classroom context] makes it more likely that I can learn [discipline].
  4. The technology used during my [classroom context] is useful for learning [discipline].
  5. I think the technology used during my [classroom context] presents a more equitable process for helping with learning [discipline].
  6. I am satisfied with the technology used during my [classroom context] for learning [discipline].
  7. I am able to learn [discipline] in a timely manner because of the technology used during my [classroom context].
  8. The technology used during my [classroom context] increases my ability to learn [discipline].
  9. I am better able to learn [discipline] with the technology used during my [classroom context].

Perceived Ease of Use

  1. I am comfortable with my ability to access the technology used during my [classroom context].
  2. Learning to operate the technology used during my [classroom context] is easy for me.
  3. It is easy for me to become skillful at operating the technology used during my [classroom context].
  4. I find the technology used during my [classroom context] easy to operate.
  5. I can always remember how to operate the technology used during my [classroom context].

Additional Items

  1. My professor made an effort to effectively implement the technology used during my [classroom context]. (Rated as above)
  2. What aspects of your [classroom context] were most useful or valuable? (Free response)
  3. How would you improve your [classroom context]? (Free response)
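Responses to a Likert-style instrument like this are typically summarized per dimension by averaging each respondent’s ratings within that dimension. The sketch below is purely illustrative and not part of the original instrument; the dimension keys, data layout, and example ratings are all assumptions:

```python
# Minimal sketch: averaging a respondent's 1-5 Likert ratings within
# each dimension of the survey. The dimension names and example values
# below are illustrative assumptions, not part of the Yen et al. (2010)
# instrument itself.

def dimension_means(responses):
    """Return the mean rating for each dimension of one respondent."""
    return {
        dimension: sum(ratings) / len(ratings)
        for dimension, ratings in responses.items()
    }

# One respondent's ratings, keyed by dimension (illustrative values).
respondent = {
    "impact": [4, 5],                                      # 2 items
    "perceived_usefulness": [4, 3, 5, 4, 3, 4, 5, 4, 4],   # 9 items
    "perceived_ease_of_use": [5, 4, 4, 5, 3],              # 5 items
}

print(dimension_means(respondent))
```

Comparing the two modalities (e.g., virtual reality vs. Zoom) would then amount to comparing these per-dimension means across the two groups of respondents.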

The MSLQ (Motivated Strategies for Learning Questionnaire) was developed by Pintrich & De Groot (1990) to measure the learning strategies and academic motivation of college students. It consists of 44 items, and students are asked to rate each statement "based on your behavior in this class" on a scale from 1 ("not at all true of me") to 7 ("very true of me"). The questionnaire features the following subscale structure:

  • Motivational beliefs
    • Intrinsic value 
    • Self-efficacy
    • Test anxiety
  • Self-regulated learning strategies
    • Cognitive strategy use
    • Self-regulation

The first link above presents the student-facing version, and the second link (the paper itself) includes an appendix that sorts the items by subscale and denotes the reverse-scored items. 
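When computing subscale scores, reverse-scored items on a 7-point scale are conventionally recoded as 8 minus the raw rating before averaging. The sketch below illustrates that recoding only; the set of reverse-scored item numbers must be taken from the paper’s appendix, and the set used here is a placeholder assumption:

```python
# Sketch of recoding reverse-scored items on the MSLQ's 7-point scale.
# A reverse-scored raw rating x (1-7) is recoded as 8 - x, so 1 <-> 7,
# 2 <-> 6, and 4 stays 4. Which item numbers are actually reverse-scored
# must be taken from the appendix of Pintrich & De Groot (1990); the set
# below is a placeholder, NOT taken from the paper.

REVERSE_SCORED = {3, 12}  # placeholder item numbers

def recode(item_number, raw_rating):
    """Return the scored value for one item (raw_rating in 1..7)."""
    if not 1 <= raw_rating <= 7:
        raise ValueError("MSLQ ratings run from 1 to 7")
    if item_number in REVERSE_SCORED:
        return 8 - raw_rating
    return raw_rating
```

Subscale scores are then the mean of the recoded ratings for the items in that subscale.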

VALUE rubrics, published by AAC&U, are open educational resources that enable educators to assess students’ original work. AAC&U also offers a methodology for applying the VALUE rubrics to evaluate student performance reliably across sixteen broad, cross-cutting learning outcomes. The rubrics address the following domains:

  1. Civic Engagement - Local and Global
  2. Creative Thinking
  3. Critical Thinking
  4. Ethical Reasoning
  5. Foundations and Skills for Lifelong Learning
  6. Global Learning
  7. Information Literacy
  8. Inquiry and Analysis
  9. Integrative and Applied Learning
  10. Intercultural Knowledge and Competence
  11. Oral Communication
  12. Problem Solving
  13. Quantitative Literacy
  14. Reading
  15. Teamwork
  16. Written Communication

The CCI (Fraser & Treagust, 1986) gives instructors insight into the extent to which a classroom climate is constructive and positive and, therefore, conducive to learning. The CCI also assesses student preferences about classroom climate, which can provide a roadmap for necessary changes. The questionnaire is designed for use in small classes or seminars rather than lectures or laboratory classes.


COPUS – the Classroom Observation Protocol for Undergraduate STEM, a product of Carl Wieman's Science Education Initiative – was created to facilitate the collection of information on the range and frequency of teaching practices in STEM fields at department-wide and institution-wide scales (Smith et al., 2013). After a short (roughly 1.5-hour) training period, observers can use COPUS to reliably characterize how faculty and students spend their time in the classroom.

SALG (the Student Assessment of their Learning Gains; see Scholl & Olson, 2014) is a flexible tool that measures the degree to which a course helps students achieve its learning objectives. SALG invites students to reflect on their own learning and how it relates to specific elements of the course.

The instrument includes five overall questions:

  1. How much did the following aspects of the course help you in your learning? [insert aspects of interest]
  2. As a result of your work in this class, what gains did you make in your understanding of each of the following? [insert concepts of interest]
  3. As a result of your work in this class, what gains did you make in the following skills? [insert skills of interest]
  4. As a result of your work in this class, what gains did you make in the following? [insert attitudinal issues of interest]
  5. As a result of your work in this class, what gains did you make in integrating the following? [insert sub-items that address how students integrate information]

Read more about the development, validity, and reliability of SALG here.